2,304 times last year (in just one state), so-called self-driving cars, ahem, didn't self-drive, according to this report at Auto Connected Car News.
"The technology is not safe unless it is monitored by a human behind a steering wheel who can take control," Consumer Watchdog said.
Reasons for disengagement include:
[a lot of human factors -- which "AI" does not understand]
* Hardware discrepancy.
* Errors in detection.
* GPS signal issues.
* Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.
(Score: 2) by Thexalon on Saturday February 03 2018, @05:20PM
I'm guessing the primary use of the GPS info is to figure out where the car is supposed to be trying to go. The rest of the software can keep the car in its lane at a reasonable speed, but it can't always tell whether you're trying to take the next exit ramp or turn left at the next stop sign (especially if the streets aren't well marked, a fairly common occurrence).
The only thing that stops a bad guy with a compiler is a good guy with a compiler.