
posted by janrinok on Sunday July 26 2015, @10:07PM   Printer-friendly
from the me-and-my-mechanical-buddy dept.

Slate and the University of Washington have published recent articles discussing robotics and how hard it is, they argue, even to begin to define the nature and scope of robotics, let alone questions such as liability for harm. They say:

Robots display increasingly emergent behavior...in the sense of wondrous complexity created by simple rules and interactions—permitting the technology to accomplish both useful and unfortunate tasks in unexpected ways. And robots, more so than any technology in history, feel to us like social actors—a tendency so strong that soldiers sometimes jeopardize themselves [livescience.com] to preserve the "lives" of military robots in the field.

[Robotics] combines, arguably for the first time, the promiscuity of information with the capacity to do physical harm. Robotic systems accomplish tasks in ways that cannot be anticipated in advance, and robots increasingly blur the line between person and instrument. Today, software can touch you, which may force courts and regulators to strike a new balance.

This seems like calmly worded yet unnecessary hype that is severely premature. Why not simply hold manufacturers and owners responsible, as we do now? I suppose this ignores the possible eventual development of true AI, where such an entity might be 'a person' who could be sued or thrown in jail. If an AI iteration is only as smart as a dog, then its owner pays if it bites.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Monday July 27 2015, @06:49PM (#214462)

    Well, I think those disclaimers should be enough to push the responsibility onto those who aren't allowed to push it any further.

    That is, car manufacturers aren't (or at least shouldn't be) able to push that liability off, since they are incorporating the software into a massive device that could easily kill someone. Therefore, any software they build into the device is their responsibility regardless of who wrote it, unless the software writers are willing to accept the liability themselves (possibly in exchange for higher pay for the software).