
SoylentNews is people

posted by mrpg on Friday February 02 2018, @09:00PM   Printer-friendly
from the what-could-possibly-go-rwong dept.

2,304 times last year (in just one state) so-called self-driving cars, ahem, didn't self-drive, according to this report at AUTO Connected Car News.

The technology is not safe unless it is monitored by a human behind a steering wheel who can take control, Consumer Watchdog said.

Reasons for disengagement include:
    [a lot of human factors -- which "AI" does not understand]
    * Hardware discrepancy.
    * Errors in detection.
    * GPS signal issues.
    * Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Saturday February 03 2018, @02:36AM (1 child)

    by Anonymous Coward on Saturday February 03 2018, @02:36AM (#632321)

    "Nobody's died as a result of any of this."

    There's the Therac-25 radiation therapy machine that killed people because of a software bug. https://en.wikipedia.org/wiki/Therac-25 [wikipedia.org]

  • (Score: 2) by Grishnakh on Saturday February 03 2018, @02:41AM

    by Grishnakh (2831) on Saturday February 03 2018, @02:41AM (#632325)

    Thexalon didn't say that no one's ever died from a software bug, he said that no one's died from autonomous car testing, which is correct.