Samir Chopra at The Nation proposes that we treat algorithms as agents of the companies that deploy them. In effect, treat computer programs as people, too.
From the article:
I suggest we fit artificial agents like smart programs into a specific area of law, one a little different from that which makes corporations people, but in a similar spirit of rational regulation. We should consider programs to be legal agents--capable of information and knowledge acquisition like humans--of their corporate or governmental principals. The Google defense--your privacy was not violated because humans didn't read your e-mail--would be especially untenable if Google's and NSA's programs were regarded as their legal agents: by agency law and its central doctrine of respondeat superior (let the master answer), their agents' activities and knowledge would become those of their legal principal, and could not be disowned; the artificial automation shield between the government (the principal) and the agent (the program) would be removed.
If such a position were adopted, it could significantly affect the permissibility of scanning emails for targeted advertising, or ISPs' ability to perform deep packet inspection.
(Score: 2) by buswolley on Thursday June 05 2014, @10:47PM
Well. By installing the program, you make it your agent...unless the seller of the software did not disclose all processes therein....Actually, this should PUSH open source, as companies might find that they are liable unless the software is disclosed to the buyer.
subicular junctures
(Score: 2) by Sir Garlon on Friday June 06 2014, @11:34AM
For what it's worth, my intent was to condemn the veil of secrecy surrounding proprietary software: on one hand the vendor knows the software is going to be used as the back-end for emergency dispatch or whatever, yet disavows any responsibility if it fails to deliver messages to the ambulances. The buyer would be liable for using dud software, same as if they had hired a human phone operator who left her post and was out having a cigarette when the emergency call came in. I think there is no way a company under those circumstances would buy software it can't validate, and no way it would simultaneously accept all liability for that software's safe operation. So where I think the market would go is: either you adopt open-source software and test and validate it in-house, or you buy proprietary software and pay through the nose for the developer to indemnify you.
[Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.