Samir Chopra at The Nation proposes that we treat algorithms as agents of the companies that deploy them. In effect, treat computer programs as people, too.
From the article:
I suggest we fit artificial agents like smart programs into a specific area of law, one a little different from that which makes corporations people, but in a similar spirit of rational regulation. We should consider programs to be legal agents--capable of information and knowledge acquisition like humans--of their corporate or governmental principals. The Google defense--your privacy was not violated because humans didn't read your e-mail--would be especially untenable if Google's and NSA's programs were regarded as their legal agents: by agency law and its central doctrine of respondeat superior (let the master answer), their agents' activities and knowledge would become those of their legal principal, and could not be disowned; the artificial automation shield between the government (the principal) and the agent (the program) would be removed.
If such a position were adopted, it could significantly affect the permissibility of scanning emails for targeted advertising, or ISPs' ability to perform deep packet inspection.
(Score: 3, Insightful) by Bytram on Thursday June 05 2014, @07:38PM
Maybe not in a personhood sort of way, but something about this strikes me as very interesting.
The operator of a car that strikes a pedestrian is responsible for the injury, not the car itself. The operator of a gun fired at a person is responsible as well. Why shouldn't the operator of a computer (program) that causes harm be responsible, too?
(Score: 3, Insightful) by jimshatt on Thursday June 05 2014, @07:50PM
OTOH, suppose you have a rash or an STD or whatever you're ashamed of. You could go to your doctor, but he's a friend of the family, so you don't really want him to know. The alternative is some smart vending machine that scans your rash and dispenses medication. This machine is owned by the same doctor, and he gets the dollar bills you put into it. The doctor doesn't invade your privacy, because only the computer program knows.
(Score: 3, Interesting) by bucc5062 on Thursday June 05 2014, @08:02PM
When it's a program, whom do you then hold accountable...
The Original programmer(s)
The maintenance programmers
The operators who may run/set up program operations
The Project Manager
The CEO
With a car you have one actor in control.
With a gun you have one brain pulling the trigger.
With a program you have a chorus, which is why it is hard to criminally charge a company/corporation with negligence: the blame can be diluted to the point of being worthless. I get where the author is coming from, but I would think this would open up a can of worms, not help make a tasty meal.
The more things change, the more they look the same
(Score: 2) by maxwell demon on Thursday June 05 2014, @11:14PM
When in doubt, the one who decided that the program is ready to be applied.
The Tao of math: The numbers you can count are not the real numbers.
(Score: 1) by slartibartfastatp on Thursday June 05 2014, @09:44PM
And it kinda answers the question posed in a story here some time ago: when the algorithm of your (future) automated car decides to crash based on the type of the car involved, who can be held accountable? Obviously not the "driver".