Samir Chopra at The Nation proposes that we treat algorithms as agents of the companies that deploy them. In effect, treat computer programs as people, too.
From the article:
I suggest we fit artificial agents like smart programs into a specific area of law, one a little different from that which makes corporations people, but in a similar spirit of rational regulation. We should consider programs to be legal agents--capable of information and knowledge acquisition like humans--of their corporate or governmental principals. The Google defense--your privacy was not violated because humans didn't read your e-mail--would be especially untenable if Google's and NSA's programs were regarded as their legal agents: by agency law and its central doctrine of respondeat superior (let the master answer), their agents' activities and knowledge would become those of their legal principal, and could not be disowned; the artificial automation shield between the government (the principal) and the agent (the program) would be removed.
If such a position were adopted, it could significantly affect the permissibility of scanning emails for targeted advertising, or ISPs' ability to perform deep packet inspection.
(Score: 2) by buswolley on Thursday June 05 2014, @07:38PM
I think you have missed the point. What you are (jokingly) advocating is the status quo, not what the article advocates.
subicular junctures
(Score: 1) by Angry Jesus on Thursday June 05 2014, @07:40PM
I think this article is really a test to determine who just reads the headlines.
(Score: 2) by bob_super on Thursday June 05 2014, @08:39PM
I am merely pointing out that in most cases, the person using a tool to commit a crime is clearly the one who gets blamed.
When the NSA programs sort your private life into a database, somehow they get to argue that "nothing happened" until a human queries the result, because nobody is "using" the tool or something.
Which is BS. It's not a virus or a defect: someone intentionally started the program, knowing what it does. That person should be responsible, along with their superiors.
(Score: 2) by buswolley on Thursday June 05 2014, @10:42PM
And the article is essentially advocating your position.
subicular junctures
(Score: 0) by Anonymous Coward on Friday June 06 2014, @03:30AM
Yes, which, as you were told, is the entire point of TFA: that the people deploying the programs should be held accountable for what the programs do, not get to claim they had no knowledge because of the lack of human interaction.
(Score: 2) by monster on Friday June 06 2014, @02:24PM
Not really the people writing the program, but the people setting it up for nefarious purposes, IMHO.
A program is a tool. Unless it was created with a clear criminal purpose, the people responsible for its behaviour should be those who deployed it in a specific, criminal setup (think guns, robbers and banks: you don't need "gun personhood" to find the robbers guilty. Same here).
(Score: 1) by meisterister on Friday June 06 2014, @12:22AM
I think that this brings up an interesting point. IANAL, but wouldn't he be able to accomplish his intended goal by incorporating the chainsaw?
(May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.