A little while back, I saw the following tweet:
I can print mostly. My wifi works often. The Xbox usually recognises me. Siri sometimes works. But my self driving car will be *perfect*.
The tweet has since been deleted, so I won't name the author, but it's a thought-provoking idea. At first, I agreed with it. I'm a programmer and know full well just how shoddy 99.9% of the code we all write is. The idea of putting my life in the hands of a coder like myself is a bit worrying.
[...] The reality is that self-driving cars don't need to be perfect. They just need to be better than the alternative: human-driven cars. And that is a much lower bar, as human beings are remarkably bad at driving.
[...] Self-driving cars don't get tired. They don't get drunk. They don't get distracted by friends or a crying baby. They don't look away from the road to send a text message. They don't speed, tailgate, brake too late, forget to show a blinker, drive too fast in bad weather, run red lights, race other cars at red lights, or miss exits. Self-driving cars aren't going to be perfect, but they will be a hell of a lot better than you and me.
Related: The High-Stakes Race to Rid the World of Human Drivers
(Score: 3, Insightful) by TheLink on Monday January 04 2016, @08:58AM
If I'm driving and I screw up - I'm liable. If the car drives itself and it screws up, the car manufacturer is liable. And you may have two parties claiming against the manufacturer - the owner of the self-driving car or his insurer could sue the manufacturer too.
And there'd probably be people trying to find flaws in your AIs that are legally exploitable - e.g. they drive or walk or decorate/paint their cars/walls in a certain legal way and that causes your self-driving car to make mistakes. Then people sue you.
So if you were a car manufacturer making self-driving cars, how much better would you want your cars to be before you would take responsibility for their mistakes? Just a bit better than the average human driver? I doubt it.
(Score: 4, Funny) by theluggage on Monday January 04 2016, @11:14AM
e.g. they drive or walk or decorate/paint their cars/walls in a certain legal way and that causes your self-driving car to make mistakes. Then people sue you.
The prosecuting attorney in the long-running Acme vs. Tesla trial died today during a jury visit to Acme's headquarters. He was pointing out the damage caused to the life-size mural of a road tunnel entrance decorating the building's front wall when a heavy vehicle suddenly emerged from the painting and failed to stop. Attempts to re-inflate the attorney with a tyre pump failed. Police are looking for a large flat-bed truck carrying anvils, weights marked '1000 tons' and grand pianos, possibly driven by a rabbit or a duck.
(Score: 2) by VLM on Monday January 04 2016, @12:54PM
And there'd probably be people ... legally exploitable ... Then people sue you.
The closest analogy to self driving cars is general aviation. Maybe even closer would be general aviation autopilots.
The way it works is some doctor with a severe case of get-home-itis tries to fly through a thunderstorm, and naturally dies. Then the family sues everyone with deep pockets even tangentially related to the crash. Do you mow the lawn at the destination airport that he never reached? Do you have money? You're going to be sued. Then everyone pays an out-of-court settlement and increases their rates. That's why a metal structure that "should cost" $50K ends up costing $1M. Or a crate engine that "should cost" $5K if it were a truck or generator engine ends up selling for $25K.
The problem with self-driving cars is that liability insurance is going to maybe quintuple the cost of the car.
We don't have a functioning legal system, in that very few people understand it and it's too expensive for most of us to participate in anyway. Nonetheless we do have a legal system, and I assure you, anyone can sue anyone for any reason, then essentially blackmail them into an out-of-court settlement for less than the cost of a typical legal defense, assuming they're not judgment-proof. That, basically, is our legal system.
A self driving car is legally a non-starter.
(Score: 1) by legont on Monday January 04 2016, @06:19PM
Actually, commercial aeroplanes have been self-flying for a while already. Depending on airline policy, the pilot may or may not take control just before landing; the rest is always autopilot. Russians usually fly their planes by hand, Americans and Europeans sometimes, Asians almost never. Regardless, the full legal responsibility is on the pilots.
This is the most likely way cars will go. Drivers will still go to prison for sitting drunk inside fully automated cars.
"Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
(Score: 0) by Anonymous Coward on Monday January 04 2016, @08:04PM
I think being safer than the 90th percentile of human drivers would be adequate. That would mean a huge reduction in traffic injuries and fatalities. I'm pretty sure self-driving cars can achieve that level of safety while increasing throughput, and reducing the variance in throughput, at the same time.
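A rough back-of-envelope sketch of why that bar implies a big reduction (all numbers here are hypothetical assumptions, not real crash statistics): if per-driver crash risk is skewed, say lognormally distributed, so that a minority of bad drivers account for most of the risk, then capping every car at the risk level of the 90th-percentile-safe driver cuts the fleet average by far more than 10%.

```python
import random
import statistics

random.seed(0)

# Hypothetical per-driver annual crash rates, lognormally distributed.
# The distribution and its parameters are assumptions chosen only to
# illustrate risk concentrated in the worst drivers.
rates = [random.lognormvariate(mu=-4.0, sigma=1.0) for _ in range(100_000)]

mean_rate = statistics.fmean(rates)

# "Safer than the 90th percentile of human drivers" means at most the
# risk of the driver at the 10th percentile of crash rates.
p10_rate = sorted(rates)[len(rates) // 10]

reduction = 1 - p10_rate / mean_rate
print(f"mean annual crash rate:      {mean_rate:.4f}")
print(f"90th-percentile-safe rate:   {p10_rate:.4f}")
print(f"implied fleet-wide reduction: {reduction:.0%}")
```

Under these assumed parameters the implied reduction comes out well above half, because the fleet mean is dragged up by the long tail of high-risk drivers that the cap removes.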
(Score: 0) by Anonymous Coward on Tuesday January 05 2016, @01:16PM
If you were the car manufacturer would that really be safe enough for you?
If elevators were merely safer than 90% of all human stair climbers I think elevator manufacturers would be sued and shut down.
http://www.livescience.com/17504-fatal-nyc-accident-elevators-safer-stairs.html [livescience.com]
But maybe car manufacturers might study how escalator manufacturers do it...