2,304 times last year (in just one state), so-called self-driving cars, ahem, didn't self-drive, according to this report at Auto Connected Car News.
"The technology is not safe unless it is monitored by a human behind a steering wheel who can take control," Consumer Watchdog said.
Reasons for disengagement include:
[a lot of human factors -- which "AI" does not understand]
* Hardware discrepancy.
* Errors in detection.
* GPS signal issues.
* Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.
(Score: 0) by Anonymous Coward on Saturday February 03 2018, @02:36AM (1 child)
There's the Therac-25 radiation therapy machine that killed people because of a software bug. https://en.wikipedia.org/wiki/Therac-25 [wikipedia.org]
(Score: 2) by Grishnakh on Saturday February 03 2018, @02:41AM
Thexalon didn't say that no one has ever died from a software bug; he said that no one has died from autonomous car testing, which is correct.