posted by martyb on Thursday June 05 2014, @06:50PM   Printer-friendly
from the while-(true)-{spawn()} dept.

Samir Chopra at The Nation proposes that we treat algorithms as agents of the companies that deploy them. In effect, treat computer programs as people, too.

From the article:

I suggest we fit artificial agents like smart programs into a specific area of law, one a little different from that which makes corporations people, but in a similar spirit of rational regulation. We should consider programs to be legal agents--capable of information and knowledge acquisition like humans--of their corporate or governmental principals. The Google defense--your privacy was not violated because humans didn't read your e-mail--would be especially untenable if Google's and NSA's programs were regarded as their legal agents: by agency law and its central doctrine of respondeat superior (let the master answer), their agents' activities and knowledge would become those of their legal principal, and could not be disowned; the artificial automation shield between the government (the principal) and the agent (the program) would be removed.

If such a position were adopted, it could have a significant impact on the permissibility of scanning emails for targeted advertisements, or on ISPs' ability to perform deep packet inspection.

 
  • (Score: 2) by buswolley on Thursday June 05 2014, @07:38PM

    by buswolley (848) on Thursday June 05 2014, @07:38PM (#51866)

    I think you have missed the point. What you are (jokingly) advocating is the status quo, not what the article advocates.

    --
    subicular junctures
  • (Score: 1) by Angry Jesus on Thursday June 05 2014, @07:40PM

    by Angry Jesus (182) on Thursday June 05 2014, @07:40PM (#51868)

    I think this article is really a test to determine who just reads the headlines.

  • (Score: 2) by bob_super on Thursday June 05 2014, @08:39PM

    by bob_super (1357) on Thursday June 05 2014, @08:39PM (#51897)

    I am merely pointing out that in most cases, the person using a tool to commit a crime is clearly the one who gets blamed.
    When the NSA's programs sort your private life into a database, somehow they get to argue that "nothing happened" until a human queries the result, because nobody is "using" the tool or something.
    Which is BS. It's not a virus or a defect; someone intentionally started the program, knowing what it does. That person should be responsible, along with their chain of command.

    • (Score: 2) by buswolley on Thursday June 05 2014, @10:42PM

      by buswolley (848) on Thursday June 05 2014, @10:42PM (#51945)

      And the article is essentially advocating your position.

      --
      subicular junctures
    • (Score: 0) by Anonymous Coward on Friday June 06 2014, @03:30AM

      by Anonymous Coward on Friday June 06 2014, @03:30AM (#52046)

      Yes, which, as you were told, is the entire point of TFA: that the people writing the programs should be held accountable for what the programs do, not get to claim they had no knowledge for lack of human interaction.

      • (Score: 2) by monster on Friday June 06 2014, @02:24PM

        by monster (1260) on Friday June 06 2014, @02:24PM (#52254) Journal

        Not really the people writing the program, but the people setting it up for nefarious purposes, IMHO.

        A program is a tool. Unless it was created with a clearly criminal purpose, the people responsible for its behaviour should be those who deployed it in a specific, criminal setup (think guns, robbers and banks: you don't need to establish "gun personhood" to find the robbers guilty. Same here).

  • (Score: 1) by meisterister on Friday June 06 2014, @12:22AM

    by meisterister (949) on Friday June 06 2014, @12:22AM (#51982) Journal

    I think that this brings up an interesting point. IANAL, but wouldn't he be able to accomplish his intended goal by incorporating the chainsaw?

    --
    (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.