
posted by martyb on Tuesday February 05 2019, @08:32AM   Printer-friendly
from the https://www.xkcd.com/1559/ dept.

The U.K. Daily Mail provides five reasons why we won't be seeing autonomous cars take over any time soon.

SNOW AND WEATHER

[...] Heavy snow, rain, fog and sandstorms can obstruct the view of cameras. Light beams sent out by laser sensors can bounce off snowflakes, and the software can mistake them for obstacles.

Radar can see through the weather, but it doesn't show the shape of an object, which computers need in order to figure out what it is.

[...] PAVEMENT LINES AND CURBS

Across the globe, roadway marking lines are different, or they may not even exist. Lane lines aren't standardized, so vehicles have to learn how to drive differently in each city.

[...] DEALING WITH HUMAN DRIVERS

For many years, autonomous vehicles will have to deal with humans who don't always play by the rules.

[...] LEFT TURNS

Deciding when to turn left in front of oncoming traffic without a green arrow is one of the more difficult tasks for human drivers and one that causes many crashes. Autonomous vehicles have the same trouble.

[...] CONSUMER ACCEPTANCE

The fatal Uber crash near Phoenix last year did more than push the pause button on testing.

It also rattled consumers who someday will be asked to ride in self-driving vehicles.

Surveys taken after the Uber crash showed that drivers are reluctant to give up control to a computer.

I fully intend to spend my twilight years relaxing in relative safety while the car drives me around; I'm gonna be torqued if they take too long.


Original Submission

 
  • (Score: 3, Interesting) by theluggage (1797) on Tuesday February 05 2019, @01:41PM (#796652)

    Deciding when to turn left in front of oncoming traffic without a green arrow is one of the more difficult tasks for human drivers and one that causes many crashes. Autonomous vehicles have the same trouble.

    One related question people don't seem to be asking is: what happens if 30% of the cars on the road suddenly start driving like they're taking their drivers' exam?

    If you regularly drive then I'm sure you'll be able to think of a few spots where - if you want to pull out sometime this century - you have to take a calculated risk or drive 'assertively'. I'm not talking about stupid, reckless risks, just the sort of "that driver is going to yield to me - he just doesn't know it yet" or "OK, there could be a crack-addled texting driver about to appear around that blind corner at 70mph, but I can't see them and however long I wait here I'm not going to develop X-ray vision" type of decision that happens in a world where the road layout wasn't designed by an omniscient genius starting from a blank canvas.

    Now, in some cases, more people sticking to the rules will make things flow more smoothly, but the real advantage of that won't cut in until everybody is in an autonomous car. Meanwhile, how many road junctions are already working at 99.8% capacity, and what would happen if the average wait increased by 20% because autonomous cars were erring more on the side of caution?
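    To see why a small dose of extra caution at a near-saturated junction matters so much, here is an illustrative queueing sketch (my own M/M/1 example, not from the comment; the 99.8% figure is the commenter's hypothetical): the average wait is 1/(mu - lambda), which blows up as the arrival rate approaches the junction's clearing rate.

    ```python
    # Illustrative M/M/1 queue sketch: wait time explodes near capacity.
    # All numbers below are assumptions for illustration only.

    def avg_wait(arrival_rate, service_rate):
        """Mean time in an M/M/1 system: W = 1 / (mu - lambda)."""
        assert arrival_rate < service_rate, "queue is unstable at or above capacity"
        return 1.0 / (service_rate - arrival_rate)

    mu = 10.0            # junction clears 10 cars/min (assumed)
    lam = 0.998 * mu     # arrivals at 99.8% of capacity

    # Already a long wait at 99.8% utilisation:
    print(f"wait at 99.8% capacity: {avg_wait(lam, mu):.1f} min")

    # If cautious autonomous cars slow clearing by just 2%,
    # arrivals now exceed capacity and the queue grows without bound:
    mu_cautious = 0.98 * mu
    print(f"queue unstable after 2% slowdown: {lam >= mu_cautious}")
    ```

    The point of the sketch is that delay is not linear in caution: a 2% reduction in how fast cars clear a junction that is already at 99.8% utilisation doesn't add 2% to the wait, it tips the junction past capacity entirely.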

    Interesting thing is, if other makers follow the Tesla model and start releasing cars that are a software update away from full autonomy, millions of self-driving cars could materialise overnight. Don't make any travel plans for that day.

    There's a side issue, of course, which is how you sell drivers on cars that drive around at 5mph under the posted limit, slow to 20mph for every traffic light just in case it changes, and refuse to start if the windscreen washer is less than 20% full, because the manufacturers' lawyers insist on an ultra-conservative reading of the highway code to avoid liability (alternatively, are you going to get behind the no-wheel of a self-driving car unless the manufacturer indemnifies you against accidents?). If you think that self-driving cars are actually going to be safer than human drivers, bear in mind that part of that will inevitably come from driving more cautiously.

    Of course, the problem solves itself once all the cars are autonomous and talking to each other: all the road signs can be ripped up and dense streams of fast-moving traffic just flow in a perfect technological ballet... but that doesn't happen with a mix of manual and autonomous vehicles (oh, and it also needs cars that are 100% bug-free, mechanically reliable, follow the same standards, etc., otherwise the first blow-out will turn the freeway into a meat-grinder - basically not gonna happen while people still want to own private cars - and if you overcome the desire for private ownership there's a much simpler solution for making transport more efficient called 'buses and trains').
