A little while back, I saw the following tweet:
I can print mostly. My wifi works often. The Xbox usually recognises me. Siri sometimes works. But my self driving car will be *perfect*.
The tweet has since been deleted, so I won't name the author, but it's a thought-provoking idea. At first, I agreed with it. I'm a programmer and know full well just how shoddy 99.9% of the code we all write is. The idea that I would put my life in the hands of a coder like myself is a bit worrying.
[...] The reality is that self-driving cars don't need to be perfect. They just need to be better than the alternative: human-driven cars. And that is a much lower bar, as human beings are remarkably bad at driving.
[...] Self-driving cars don't get tired. They don't get drunk. They don't get distracted by friends or a crying baby. They don't look away from the road to send a text message. They don't speed, tailgate, brake too late, forget to use a blinker, drive too fast in bad weather, run red lights, race other cars at red lights, or miss exits. Self-driving cars aren't going to be perfect, but they will be a hell of a lot better than you and me.
Related: The High-Stakes Race to Rid the World of Human Drivers
(Score: 3, Insightful) by SomeGuy on Monday January 04 2016, @12:42PM
I've said this before, but people have some fantastic, unrealistic ideas about how self-driving cars would operate. The quote in this story is just one very small example of that.
Driving on a public road is not as simple as some people think. There are random obstacles like non-self-driving cars, people or animals, falling tree limbs, large potholes, and road construction. There is driving in random weather conditions such as hail, snow, or on roads that flash-flood. And those are just simple examples that come to mind offhand. The real world can be a very complicated, unpredictable place.
You might be able to program a self-driving car to get from point A to point B in ideal conditions. But as the old saying goes, "to err is human, to really fuck things up takes a computer."
Imagine 10000 cars driving off a collapsed bridge into a canyon before the error is caught. That is a simplistic example that might actually get programmed for and caught, but all it takes is one unusual edge case to create a disaster of such magnitude.
Computers cannot think or reason ("Computers don't get happy, they don't get sad, they don't get tired, they just run programs!"). So instead of recognizing and understanding the world around them, self-driving cars are essentially placed on invisible electronic train-tracks.
So perhaps we will eventually get there, but it will require that we vastly change the way we think about our roads and what self-driving cars can actually do.
And long term, do you think any company is really going to maintain these systems? I've seen so many software products go from solid, robust, desirable products to buggy, unusable, unmaintained crap that it isn't even funny.
(Score: 1, Funny) by Anonymous Coward on Monday January 04 2016, @01:07PM
> Imagine 10000 cars driving off a collapsed bridge into a canyon before the error is caught.
Quick thought: If any of you knows anybody working on these, ask them what the car will do with the information "the car in front of me just vanished".
(Score: 2) by Nuke on Monday January 04 2016, @02:58PM
"Go faster" ?
(Score: 0) by Anonymous Coward on Monday January 04 2016, @03:58PM
...That probably *is* the correct answer.
They may complain if the lines on the road disappear as well, though.
Not sure what a self-driving car would do at a large sinkhole that leaves the lines intact.
(Score: 0) by Anonymous Coward on Monday January 04 2016, @05:08PM
Probably something along the line of "he is not being in the specifications. We are not being make code for that"