2,304 times last year (in just one state), so-called self-driving cars, ahem, didn't self-drive, according to this report at Auto Connected Car News.
"The technology is not safe unless it is monitored by a human behind a steering wheel who can take control," Consumer Watchdog said.
Reasons for disengagement include:
[a lot of human factors -- which "AI" does not understand]
* Hardware discrepancy.
* Errors in detection.
* GPS signal issues.
* Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.
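The reasons above amount to a rough taxonomy of disengagement events. A minimal sketch of how such report entries might be tallied per category — the category keys and the `tally` helper here are illustrative assumptions, not the actual schema of the California DMV reports:

```python
from collections import Counter

# Assumed category keys, loosely matching the reasons listed above.
CATEGORIES = {
    "hardware": "Hardware discrepancy",
    "perception": "Errors in detection",
    "gps": "GPS signal issues",
    "crash": "Software crash",
    "human": "Human factors",
}

def tally(entries):
    """Count disengagement events per category; unrecognized keys fall into 'other'."""
    counts = Counter()
    for key in entries:
        counts[key if key in CATEGORIES else "other"] += 1
    return counts

if __name__ == "__main__":
    events = ["gps", "perception", "perception", "human", "lidar_glitch"]
    print(tally(events))
```

Counting per category like this is what lets a report say which failure modes dominate, rather than just citing the raw 2,304 total.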
(Score: 2) by Justin Case on Friday February 02 2018, @10:22PM
I would expect that in most of those ventures, the participants signed up for the risk they were taking.
On the other hand, if your [spacecraft, satellite, mine collapse, nuke meltdown, wayward horse, non-SDC car] injures an uninvolved third party, you are or should be liable. If it is criminal neglect, you should be looking at murder charges.
Breathes there a software developer who has never launched code despite a database of known bugs? There's your criminal neglect, right there. Software doesn't usually kill people, because software does not usually have fully autonomous control of a multi-ton weapon moving waaaaaaaaay faster than you can run.