
posted by mrpg on Friday February 02 2018, @09:00PM
from the what-could-possibly-go-rwong dept.

2,304 times last year (in just one state), so-called self-driving cars, ahem, didn't self-drive, according to this report at Auto Connected Car News.

The technology is not safe unless it is monitored by a human behind a steering wheel who can take control, Consumer Watchdog said.

Reasons for disengagement include:
    * [a lot of human factors -- which "AI" does not understand]
    * Hardware discrepancy.
    * Errors in detection.
    * GPS signal issues.
    * Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.


Original Submission

 
  • (Score: 2) by Justin Case (4239) on Friday February 02 2018, @10:22PM (#632204) Journal

    I would expect that in most of those ventures, the participants signed up for the risk they were taking.

    On the other hand, if your [spacecraft, satellite, mine collapse, nuke meltdown, wayward horse, non-SDC car] injures an uninvolved third party, you are or should be liable. If it is criminal negligence, you should be looking at murder charges.

    Breathes there a software developer who has never launched code despite a database of known bugs? There's your criminal negligence, right there. Software doesn't usually kill people, because software does not usually have fully autonomous control of a multi-ton weapon moving waaaaaaaaay faster than you can run.
