A little while back, I saw the following tweet:
I can print mostly. My wifi works often. The Xbox usually recognises me. Siri sometimes works. But my self driving car will be *perfect*.
The tweet has since been deleted, so I won't name the author, but it's a thought-provoking idea. At first, I agreed with it. I'm a programmer and know full well just how shoddy 99.9% of the code we all write is. The idea of putting my life in the hands of a coder like myself is a bit worrying.
[...] The reality is that self-driving cars don't need to be perfect. They just need to be better than the alternative: human-driven cars. And that is a much lower bar, as human beings are remarkably bad at driving.
[...] Self-driving cars don't get tired. They don't get drunk. They don't get distracted by friends or a crying baby. They don't look away from the road to send a text message. They don't speed, tailgate, brake too late, forget to show a blinker, drive too fast in bad weather, run red lights, race other cars at red lights, or miss exits. Self-driving cars aren't going to be perfect, but they will be a hell of a lot better than you and me.
Related: The High-Stakes Race to Rid the World of Human Drivers
(Score: 5, Insightful) by khchung on Monday January 04 2016, @09:09AM
Self-driving cars need to fulfil one criterion, and one criterion only: there must never be an incident where it would be reasonable to say that a human driver could and would have avoided it. As long as we get to that point, everything else is moot.
Name one other tool that replaced a human that has this property.
By this criterion, we would still be living in a pre-industrial world. Almost every industrial accident involving a machine would not have happened if the machine had been a human wielding a tool. (Although another set of accidents might have happened instead, but that was not the criterion above.)
We don't need autonomous cars to be better than humans in *every way*, we just need them to be significantly better *on average*. E.g., if passengers in autonomous cars have an injury rate only 10% of that in human-driven cars (i.e. a 90% reduction), it would be stupid not to use one, even though some of those injured might have been better off with a human driver at the time of the accident.
This is the same non-argument used for avoiding vaccines: you cannot prove that everyone getting a vaccine would be better off than not. You can only prove that *on the whole* the population would be better off.
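The arithmetic behind that comment can be made concrete with a toy calculation. The rates below are hypothetical, chosen only to match the 10%/90% figures in the comment, not real accident statistics:

```python
# Illustrative arithmetic only: the rates below are invented, not real data.

def expected_injuries(trips: int, injury_rate: float) -> float:
    """Expected number of injuries over a given number of trips."""
    return trips * injury_rate

TRIPS = 1_000_000
HUMAN_RATE = 1e-4                       # hypothetical: 1 injury per 10,000 trips
AUTONOMOUS_RATE = HUMAN_RATE * 0.10     # the comment's "10% of human-driven" figure

human = expected_injuries(TRIPS, HUMAN_RATE)        # ~100 expected injuries
auto = expected_injuries(TRIPS, AUTONOMOUS_RATE)    # ~10 expected injuries
reduction = 1 - auto / human                        # ~0.9, i.e. a 90% reduction
```

The point of the comparison is that the population-level expectation improves even though, within the remaining ~10 injuries, some individuals might have fared better with a human at the wheel.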
(Score: 3, Insightful) by Jiro on Monday January 04 2016, @10:03AM
Asking to name another tool that has this property is meaningless, because most tools don't replace the human's ability to make decisions. The closest you get is a tool that does the job with a lot of changes beyond replacing the human's decision-making, with the loss of human decision-making incidental to the change in the job. And occasionally you get a tool that augments the human's senses but still requires a human to look at the new information.
(Score: 3, Insightful) by khchung on Monday January 04 2016, @03:11PM
most tools don't replace the human's ability to make decisions.
There are plenty of tools that replace your ability to make decisions; you are just so used to them that you don't notice.
Both the escalator and elevator removed your ability to stop moving in the middle of climbing a flight of stairs. Both will cause accidents that would not have happened if you walked up the stairs instead.
Meat grinders (or any automated cutting tool) replaced your ability to decide to stop or alter your cut, and there have been plenty of accidents involving lost fingers with those tools.
The printing press removed a scribe's ability to decide what to write mid-page, and I am sure there have been accidents with the printing press that would not have happened if the pages had been hand-copied.
Email's ability to reply-all or send to a distribution list removed your ability to make an individual decision about each recipient, and there have been plenty of fiascos due to email mistakenly sent to unintended recipients, which would have been avoided if you had to hand-write each address on an envelope.
Yet humanity accepted these risks for progress, and the same will happen with autonomous cars. One big red button on the dashboard that says "Emergency STOP" gives you as much control as your elevator does.
(Score: 0) by Anonymous Coward on Monday January 04 2016, @05:47PM
Wow, what a lovely set of examples.
Escalator goes up, escalator goes down. Escalator goes up, escalator goes down.
Meat grinder goes on, meat grinder goes off, meat grinder goes on, meat grinder goes off. Let's take a ride in the meat grinder.
A modern printing press is much more complex, but per your example, the human buying or using the printing press already decided, well in advance, that they want all pages to be identical.
E-mail distribution lists? If you don't want it to go to a few people on that list, you are using the tool wrong. No decisions there other than the choice to avoid learning how to use it.
So the most complex thing there is the printing press. A modern one has sensors to monitor ink levels, rotation speeds, paper inputs, paper outputs, piles of sensors to make sure computer-controlled parts move where they should, etc. But these are still very narrow and well-defined use cases. If a rat climbs into the system and breaks something, you accept that you have already made the decision that everything should stop so you can call a repairman.
But out on a road all kinds of wild unexpected shit can happen.
How would a computer handle this? Oh, the car in front is slowing down a bit (detectable). I wonder why (it won't). Oh, they just ran over something that was on the road (probably not detectable unless the cars are talking to each other and the one in front sends out an alert). Oh, shit! There is a ladder in the middle of the highway! (How fast is your object recognition? Do you even have object recognition?) Decision time! Slam on the brakes, hit it anyway, and come to a complete stop in the middle of a busy highway. Keep going exactly straight and run over the ladder, probably damaging a tire and possibly spinning out of control, killing people. Swerve left or right and smash into another car. Or (clever idea) move over just enough so the ladder goes directly under the car and hopefully only damages the undercarriage.
That is the kind of decision a self-driving car will have to make. Yes, we may have to accept it will default to a more blunt solution (probably coming to a complete stop), but it will be a much more important decision because human lives and large amounts of repair money are on the line.
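The options listed above can be sketched as a simple priority-ordered decision. This is a toy illustration with invented inputs and thresholds, not how any real autonomous-driving stack works:

```python
# Toy sketch of the debris-response options described above.
# The sensor inputs and the 0.15 m clearance threshold are invented.

def choose_response(obstacle_height_m: float,
                    left_lane_clear: bool,
                    right_lane_clear: bool,
                    can_straddle: bool) -> str:
    """Pick a response to road debris, falling back to the blunt default."""
    if left_lane_clear:
        return "change_lane_left"      # brake and go around the debris
    if right_lane_clear:
        return "change_lane_right"
    if can_straddle and obstacle_height_m < 0.15:
        return "straddle"              # line up so it passes under the car
    return "full_brake"                # blunt default: stop in-lane
```

For example, a 0.3 m ladder with both adjacent lanes occupied yields `"full_brake"`: exactly the "more blunt solution" the comment predicts the car will default to. The hard part, of course, is not this dispatch table but reliably producing its inputs from sensors in real time.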
Or we will have to re-think what a "car" is and what we should expect from it. After all, a modern printing press is not a robot arm moving around a feather quill.
(Score: 3, Interesting) by vux984 on Monday January 04 2016, @09:31PM
How would a computer handle this
How would a human handle it?
They can't see in front of the car in front of them, so their first glimpse of the ladder will be after the other car runs over it and sends it skittering at you. So what does a human do?
Slam on the brakes? Just ram it full speed? Swerve into other lanes and hope they are empty?
Move over just enough so the ladder goes directly under the car? Because humans are so practiced at lining up to run over moving debris at high speed? Seriously... a truck in front of me once sent a rock the size of a melon bouncing along the road at me at highway speed. I slammed on the brakes, but a collision was inevitable; I tried to line up so that it would at least go under the car, but I still hit it with one of the wheels and blew out a tire anyway. Suffice it to say, succeeding at that would have been more luck than brains. For man or machine.
Really, the best a human is going to do is slam on the brakes and try to minimize the speed of impact with the object, and if there is time to check mirrors etc., perhaps try to change lanes around it (while still braking in case hitting it is inevitable). The car really isn't going to do any worse than a human here, and may be better equipped to deal with the blowout / skidding after impact if it comes to that.
If the automated car comes to a safe stop great. That's no worse than a human would do.
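A back-of-the-envelope stopping-distance calculation supports this. Using textbook kinematics (reaction distance plus braking distance, v·t + v²/2a) with assumed, not measured, reaction times and deceleration, the machine's faster reaction shows up directly as shorter stopping distance:

```python
# Back-of-the-envelope stopping distances. All values are assumptions for
# illustration, not measurements of any real car, driver, or sensor stack.

def stopping_distance_m(speed_mps: float, reaction_s: float,
                        decel_mps2: float) -> float:
    """Reaction distance plus braking distance: v*t + v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

V = 30.0       # ~108 km/h highway speed
DECEL = 7.0    # assumed hard-braking deceleration, m/s^2

human = stopping_distance_m(V, 1.5, DECEL)      # assumed ~1.5 s human reaction
computer = stopping_distance_m(V, 0.2, DECEL)   # assumed sensing+actuation delay
```

Under these assumptions the difference is pure reaction distance, roughly 39 m at highway speed, which is why "no worse than a human, and probably braking sooner" is a plausible baseline even before considering post-impact handling.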
The real challenge is for the automated car BEHIND the big rig that just hit the ladder, blew out, and spun to a stop. Because what is it going to do AFTER it comes to its safe stop? The rig in front of it is sideways, blocking both lanes, and it's going to be there for an hour or three until a tow truck arrives.
Or do it and the other automated cars just sit there for hours until the tow truck comes to remove the obstacle in front of them?
While the human drivers all drove onto the shoulder / into the ditch to get around it. Or maybe the police closed one of the oncoming lanes and are directing traffic through some service crossings to detour it. Of course, your automated car needs to drive in reverse 1/2 mile (on a freeway) to get to the service entrance, which it would normally be illegal to even use, where it would cross to an oncoming traffic lane... on a freeway. (Sure, it's closed, and the police are directing traffic, and it's a proper detour route... but can the automated car figure that out?)
(Score: 3, Insightful) by quadrox on Monday January 04 2016, @10:19AM
As another poster pointed out, this requirement is only relevant for tools/machines that are fully automated. Humans like to be in control, as long as there is a human involved in the process people think everything is fine. But if you take away the last bit of human control, people will worry about safety, a lot more than is perhaps reasonable. Therefore the requirement for complete automation of anything is that it never performs worse than a human would. If you can show that, then people will be ok with it.
(Score: 0) by Anonymous Coward on Monday January 04 2016, @01:04PM
As others have mentioned, the machines you are talking about still have humans pressing the buttons (even if only the big red one). Even auto-pilots have real pilots on watch at all times. And unlike the "driver still needs to be able to take over" idea for self driving cars, they have A) several seconds longer to do so, due to traffic separation, and B) two of them, so that they won't both be reading the newspaper at the same time.
The ones that don't have people pressing the buttons tend to have big fences with locking gates that ensure that the whole thing will shut down if anyone tries to enter the working area.
That's not going to work with self-driving cars.