posted by LaminatorX on Monday May 11 2015, @06:39PM   Printer-friendly
from the better-mousetrap dept.

According to an article by the AP (via an ad-free site), several of the self-driving cars licensed to drive in California have been involved in accidents.

Most were low-speed accidents, apparently with no injuries.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads. Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop "autonomous driving," a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Neither the companies involved nor the State of California will release details of these accidents, which rankles some critics.

Four accidents involving these 50 cars in eight months may seem a little high. Google's 23 cars have driven 140,000 miles in that time and racked up 3 accidents all by themselves. That is roughly an order of magnitude higher than the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents. However, the NTSB doesn't capture every minor fender-bender.
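As a quick sanity check, the comparison in the paragraph above works out as follows. All figures are taken from the summary itself, and it is assumed the NTSB baseline is expressed per 100,000 vehicle-miles so the two rates are directly comparable:

```python
# Back-of-the-envelope accident-rate comparison using the figures cited above.

google_accidents = 3        # accidents attributed to Google's cars while self-driving
google_miles = 140_000      # miles driven by Google's 23 cars in the period
ntsb_rate_per_100k = 0.3    # cited NTSB non-injury accident rate per 100,000 miles

# Normalize Google's count to the same per-100,000-mile basis.
google_rate_per_100k = google_accidents / google_miles * 100_000
ratio = google_rate_per_100k / ntsb_rate_per_100k

print(f"Google rate: {google_rate_per_100k:.2f} accidents per 100,000 miles")
print(f"Relative to NTSB baseline: {ratio:.1f}x")
```

This yields about 2.14 accidents per 100,000 miles, roughly 7x the cited baseline, so "an order of magnitude" is a loose but defensible characterization, and only if the NTSB figure really is on a per-mile basis.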

The article says that none of the other states that permit self-driving cars have any record of accidents.

 
  • (Score: 5, Insightful) by SecurityGuy on Monday May 11 2015, @07:16PM

    We should be happy with self driving cars as soon as they're demonstrably safer than we are. Human drivers are nowhere near accident free, and a small subset of them are downright unsafe (speeding, drunk, texting, etc). If you're going to be speeding while drunk and texting, I'd much rather you be in one of Google's cars and take my chances with them.

  • (Score: 4, Insightful) by snick on Monday May 11 2015, @07:49PM

    We need to get over the concept of the "standby driver" first. Having a person who is normally disengaged from the driving task take over when the situation becomes too complex for the computer is a recipe for disaster. If you aren't actively driving, you will not be engaged with the environment around the vehicle, and the gap between the computer deciding to bail and the bad thing that scared the computer actually happening will usually be just about enough time for the "driver" to look up. No time to assess the situation, and none to react.

    Autonomous cars aren't nearly as dangerous as semi-autonomous cars.

    • (Score: 2, Interesting) by Anonymous Coward on Monday May 11 2015, @08:43PM

      I agree. A number of years ago I was driving on a major interstate at the start of a holiday weekend. I saw a vehicle a few hundred yards in front of me that entered the median. My initial thought was it was an unmarked cop car turning around to go after someone heading the opposite direction. About a second later I realized he was going much too fast to be turning around, and I mumbled something to the effect of "What the hell..." That was just enough time for my sister to look up and see the car plow into another car in the opposing lane. Traffic typically moves about 75-80 mph in that area, and the driver that crossed the median was killed. By the time I realized what I had seen the accident was a couple hundred yards behind me.

      Pretend this was an automated car. Five hours into a trip, an alarm goes off. The time my sister had to look up and see the accident would likely be all the time you'd have. So you have a full second or two to hear the alarm, stop whatever you are doing, check your surroundings, determine the threat(s), and brake and/or take evasive action. Under those conditions I would trust an automated car to make better and quicker decisions than me.

      • (Score: 3, Disagree) by Reziac on Tuesday May 12 2015, @03:33AM

        I wouldn't. I've found that I react automatically and appropriately to a hazard situation before I'm consciously aware of doing so -- often before the hazard fully manifests. Thus I've avoided a number of accidents. Judging by what I've seen around me in many years of driving in Los Angeles and across the western U.S., most people do the same. I'd say this anticipatory defensive driving is a standard feature; as Frojack points out, the stats inform us that most people *are* good drivers.

        --
        And there is no Alkibiades to come back and save us from ourselves.
        • (Score: 3, Insightful) by snick on Tuesday May 12 2015, @01:01PM

          Sure, folks react automatically to hazards. But that is because they are already actively driving, and are immersed in the vehicle's situation when the hazard presents itself. Even if you don't consciously think about it, you are aware of what is in front of, behind, and on both sides of the car at all times, and have a general idea of all the relative velocities. (or at least you ought to)

          A standby driver isn't going to be immersed in the vehicle's situation. S/he will be having a conversation with the backseat passenger, or facebooking on his/her phone, when the hazard presents itself. There is no way you can put someone in the driver's seat, give them nothing to do, and yet expect them to be alert and engaged.

          Handing off driving responsibilities from the computer to the "driver" in a crisis is not like having a driver handle a crisis. It is like suddenly handing the controls to someone in the passenger seat who was taking a nap.

          • (Score: 2) by Reziac on Tuesday May 12 2015, @01:36PM

            Exactly. Driving is not a part-time job; it's a state of awareness one gets into and stays in for the duration, which is why most of us do it well. One driver in a thousand might react appropriately when 'awakened' from their Facebook nap; the rest are more likely to panic and react badly or far too late. [Side note: I found that playing DOOM, where you can get ambushed from any angle at any moment, made me a better driver, more aware in all directions.]

            I think self-driving cars are likely to be all or nothing; if it's "some" there'll be too many vehicles acting in ways live drivers don't expect and can't properly anticipate, as the self-drivers behave in situationally-wrong ways (failing to anticipate from those tiny cues that live drivers catch). I expect our eventual adjustment will be to stay the hell away from any self-driving car we see on the road, or to confine their use to HOV-type lanes.

            Drivers in America seem to be a different beast from the rest of the world, tho. I watch a lot of those Russian dashcam vids, and here in America we just never see that level of uncoordinated, oblivious, and "Rules of the road? Whuzzat??"

            --
            And there is no Alkibiades to come back and save us from ourselves.
          • (Score: 2) by acid andy on Tuesday May 12 2015, @05:32PM

            You're obviously not a back seat driver. It's easy when someone else is driving to find yourself still going through the motions of checking it's clear at junctions, watching for hazards etc. I'd imagine if you had a suspicion that your self-driving car wasn't 100% accident proof that you'd be doing the same in it.

            --
            If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
  • (Score: 2) by gidds on Monday May 11 2015, @08:14PM

    I agree.  But I fear it won't be as easy as that.

    After all, everyone thinks they're an above-average driver.  (I do myself!)

    So you have to convince people that autonomous cars are not just safer than average, but safer than them...

    --
    [sig redacted]
    • (Score: 2) by isostatic on Monday May 11 2015, @08:47PM

      First thing you do is convince them they are safer than your average taxi driver. That won't be hard.