So-called self-driving cars, ahem, didn't self-drive 2,304 times last year (in just one state), according to this report at Auto Connected Car News.
"The technology is not safe unless it is monitored by a human behind a steering wheel who can take control," Consumer Watchdog said.
Reasons for disengagement include:
[a lot of human factors -- which "AI" does not understand]
* Hardware discrepancy.
* Errors in detection.
* GPS signal issues.
* Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 companies were required to file disengagement reports covering 2017.
(Score: 2, Insightful) by fyngyrz on Friday February 02 2018, @11:53PM (5 children)
If a supposedly autonomous vehicle can't drive safely and sanely without GPS info, I'm just going to go ahead and say it can't drive at all.
It might be put in the position of having to ask the passenger what to do, but that's the most I would accept.
Yes, it should be able to navigate to a GPS location; but no, it shouldn't need GPS to do the actual driving.
Properly implemented (IMHO, of course), it should use the same cues we do: street signs, built-in maps, traffic, road hazards, etc.
Yes, these are seriously fuzzy inputs. But until they are enough, the autonomy is just a pale shadow of what a human driver can do, and therefore wholly untrustworthy.
I might let a (supposedly) autonomous vehicle park my car today. But that's about it.
And don't get me wrong - I really want this tech to mature. But as yet, that's not even on the horizon as far as I've been able to determine.
(Score: 3, Interesting) by MostCynical on Saturday February 03 2018, @12:34AM (3 children)
LIDAR, RADAR, and infra-red cameras, as well as the ability to read signs, are already a thing.
GPS is just... part of the equation.
The quality of the *maps* is the problem.
https://twitter.com/elonmusk/status/789020841489018880?lang=en
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 4, Interesting) by fyngyrz on Saturday February 03 2018, @08:01AM (2 children)
If I have a bad map, I'll get there anyway.
That's the standard for "a driver" as far as I'm concerned.
Yes, I agree: it's not a sensor problem.
What it is, IMHO, is an LDNLS [fyngyrz.com] stacking problem, specifically, not nearly enough in the stack. Yet.
(Score: 0) by Anonymous Coward on Saturday February 03 2018, @01:09PM (1 child)
Interesting article, thanks, although I think your timeline is a bit optimistic.
(Score: 2) by fyngyrz on Sunday February 04 2018, @12:28PM
It could very well be.
I base it somewhat upon my own efforts in the field, and largely upon the notion that technologies with significant effort applied to them tend to advance nonlinearly, a well-verified pattern that many others besides myself have pointed out.
Despite such efforts, it's possible that AI won't happen at all. That doesn't seem likely to me, but I'll certainly accept the idea that it's possible.
(Score: 2) by Thexalon on Saturday February 03 2018, @05:20PM
I'm guessing the primary use of the GPS info is to figure out where it's supposed to be trying to go. The rest of the programming can keep the car in its lane at a reasonable speed, but can't always tell you whether you're trying to take the next exit ramp or turn left at the next stop sign (especially if the streets aren't well-marked, a fairly common occurrence).
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
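The split described in that last comment (local control from on-board sensing alone, route-level choices needing a destination fix) might be sketched roughly like this. Everything here is illustrative, not any vendor's actual stack: the function names, the control gains, and the toy road-graph lookup are all assumptions.

```python
# Hypothetical sketch: lane keeping needs only on-board sensors,
# but route-level decisions need a destination, which in practice
# comes via GPS. All names and numbers are illustrative.

def local_control(lane_offset_m, speed_mps, speed_limit_mps):
    """Lane keeping and speed hold from on-board sensing alone (no GPS)."""
    steer = -0.5 * lane_offset_m                 # steer back toward lane center
    throttle = 0.1 * (speed_limit_mps - speed_mps)
    return steer, throttle

def route_decision(current_node, destination_node, road_graph):
    """Route-level choice: without knowing where we're going, there is
    nothing to decide, so fall back to asking the passenger."""
    if destination_node is None:
        return "ask the passenger"
    # Naive next-maneuver lookup on a precomputed route (toy graph).
    route = road_graph.get((current_node, destination_node), [])
    return route[0] if route else "ask the passenger"
```

On this reading, losing GPS degrades navigation (the second function) but shouldn't touch the first, which is roughly the standard fyngyrz argues for upthread.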