posted by martyb on Thursday June 05 2014, @06:50PM   Printer-friendly
from the while-(true)-{spawn()} dept.

Samir Chopra at The Nation proposes that we treat algorithms as agents of the companies that deploy them. In effect, treat computer programs as people, too.

From the article:

I suggest we fit artificial agents like smart programs into a specific area of law, one a little different from that which makes corporations people, but in a similar spirit of rational regulation. We should consider programs to be legal agents--capable of information and knowledge acquisition like humans--of their corporate or governmental principals. The Google defense--your privacy was not violated because humans didn't read your e-mail--would be especially untenable if Google's and NSA's programs were regarded as their legal agents: by agency law and its central doctrine of respondeat superior (let the master answer), their agents' activities and knowledge would become those of their legal principal, and could not be disowned; the artificial automation shield between the government (the principal) and the agent (the program) would be removed.

If such a position were adopted, it could have a significant impact on the permissibility of scanning emails for targeted advertisements, or on ISPs' ability to perform deep packet inspection.
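
For concreteness, here is a minimal sketch of the kind of scanning at issue (Python; the keyword table and function names are invented for illustration and are not any real provider's pipeline). No human ever sees the message, yet the program extracts and acts on its contents:

    # Hypothetical ad-targeting scanner; all names and keywords invented.
    AD_KEYWORDS = {
        "mortgage": "refinancing offers",
        "flight": "travel deals",
        "rash": "pharmacy coupons",
    }

    def select_ads(message_body: str) -> list[str]:
        """Return ad categories triggered by words in the message.

        The program alone 'reads' the mail and acts on what it finds;
        no human at the provider ever sees the text.
        """
        body = message_body.lower()
        return [ad for word, ad in AD_KEYWORDS.items() if word in body]

    print(select_ads("My flight was delayed and the hotel gave me a rash"))
    # -> ['travel deals', 'pharmacy coupons']

Under the agency view proposed above, the knowledge such a program acquires would be imputed to its principal rather than disowned as mere automation.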

 
  • (Score: 3, Insightful) by Bytram on Thursday June 05 2014, @07:38PM


    Maybe not in a personhood sort of way, but something about this strikes me as very interesting.

    The operator of a car that strikes a pedestrian is responsible for the injury, not the car itself. The operator of a gun fired at a person is responsible as well. Why shouldn't the operator of a computer program that causes harm be responsible, too?

  • (Score: 3, Insightful) by jimshatt on Thursday June 05 2014, @07:50PM

    But what Google claims is that there is no harm done *because* it is done by a computer program: no human at Google has read your email, thus no harm done. I disagree (of course), because the program acts upon the information in the emails it reads.
    OTOH, suppose you have a rash or an STD or whatever you're ashamed of. You could go to your doctor, but he's a friend of the family, so you don't really want him to know. The alternative is some smart vending machine that scans your rash and dispenses medication. This machine is owned by the same doctor, and he gets the dollar bills you put into it. The doctor doesn't invade your privacy, because only the computer program knows.
  • (Score: 3, Interesting) by bucc5062 on Thursday June 05 2014, @08:02PM


    When it's a program, whom do you then hold accountable?

    The original programmer(s)
    The maintenance programmers
    The operators who may run/set up program operations
    The Project Manager
    The CEO

    With a car, you have one actor in control.
    With a gun, you have one brain pulling the trigger.

    With a program, you have a chorus, which is why it is hard to criminally charge a company or corporation with negligence: the blame can be diluted to the point of being worthless. I get where the author is coming from, but I would think this would open up a can of worms, not help make a tasty meal.

    --
    The more things change, the more they look the same
    • (Score: 2) by maxwell demon on Thursday June 05 2014, @11:14PM


      When in doubt, the one who decided that the program is ready to be applied.

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 1) by slartibartfastatp on Thursday June 05 2014, @09:44PM


    This kinda answers the question posed in a story here some time ago: when the algorithm of your (future) automated car decides what to crash into based on the type of the car involved, who can be held accountable? Obviously not the "driver".
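
A hypothetical sketch of the decision logic that comment imagines (the mass-based policy, weights, and names are all invented for illustration, not taken from any real vehicle):

    # Hypothetical crash-choice policy for an automated car; all values invented.
    CURB_WEIGHT_KG = {"suv": 2500, "sedan": 1500, "compact": 1100}

    def choose_collision_target(unavoidable_targets: list[str]) -> str:
        """Pick which vehicle to hit when a collision cannot be avoided.

        The policy, and hence the moral choice, is fixed by whoever
        wrote and shipped this function, not by the 'driver'.
        """
        return min(unavoidable_targets, key=lambda t: CURB_WEIGHT_KG[t])

    print(choose_collision_target(["suv", "compact"]))  # -> 'compact'

Whatever policy gets encoded here was chosen long before the crash, which is exactly why "the driver did it" is no answer.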