2,304 times last year (in just one state), so-called self-driving cars, ahem, didn't self-drive, according to this report at Auto Connected Car News.
"The technology is not safe unless it is monitored by a human behind a steering wheel who can take control," Consumer Watchdog said.
Reasons for disengagement include:
[a lot of human factors -- which "AI" does not understand]
* Hardware discrepancy.
* Errors in detection.
* GPS signal issues.
* Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.
(Score: 4, Interesting) by fyngyrz on Saturday February 03 2018, @08:01AM (2 children)
If I have a bad map, I'll get there anyway.
That's the standard for "a driver" as far as I'm concerned.
Yes, I agree: it's not a sensor problem.
What it is, IMHO, is an LDNLS [fyngyrz.com] stacking problem, specifically, not nearly enough in the stack. Yet.
(Score: 0) by Anonymous Coward on Saturday February 03 2018, @01:09PM (1 child)
Interesting article, thanks, although I think your timeline is a bit optimistic.
(Score: 2) by fyngyrz on Sunday February 04 2018, @12:28PM
It could very well be.
I base it somewhat upon my own efforts in the field, and largely upon the observation that technologies receiving significant effort tend to advance in a nonlinear manner — a well-verified pattern that many others besides myself have pointed out.
Despite such efforts, it's possible that AI won't happen at all. That doesn't seem likely to me, but I'll certainly accept the idea that it's possible.