Self Driving Cars: Not so Accident Free after All

posted by LaminatorX on Monday May 11 2015, @06:39PM
from the better-mousetrap dept.

According to an article by the AP (via an ad-free site), several of the self-driving cars licensed to drive in California have been involved in accidents.

Most are low-speed accidents, apparently with no injuries.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads. Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop "autonomous driving," a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Neither the companies involved nor the State of California will release details of these accidents, which rankles some critics.

Four accidents involving these 50 cars in 8 months may seem a little high. Google's 23 cars have driven 140,000 miles in that time and racked up 3 accidents all by themselves. That works out to roughly seven times the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents. However, the NTSB doesn't collect reports of every fender-bender.
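For a rough sanity check on those numbers, here is a back-of-the-envelope sketch using only the figures quoted above (it assumes the NTSB number is directly comparable as a per-100,000-mile rate, which may not be the case):

```python
# Back-of-the-envelope comparison using only the figures quoted in the summary.
# Assumption: the NTSB number is a non-injury accident rate per 100,000 miles.
google_accidents = 3        # accidents while Google's cars were driving themselves
google_miles = 140_000      # miles driven by Google's 23 cars since September

ntsb_rate = 0.3             # quoted NTSB non-injury accidents per 100,000 miles

google_rate = google_accidents / google_miles * 100_000
print(f"Google fleet: {google_rate:.1f} accidents per 100,000 miles")  # ~2.1
print(f"Ratio to the NTSB figure: {google_rate / ntsb_rate:.1f}x")     # ~7.1x
```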

The article says that none of the other states that permit self-driving cars have any record of accidents.

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by MrGuy on Monday May 11 2015, @07:56PM

    by MrGuy (1007) on Monday May 11 2015, @07:56PM (#181612)

    Self Driving Cars: Not so Accident Free after All

    Seriously? We let a smarmy "gotcha" headline like this through? When even the text of TFS states "Google and Delphi said their cars were not at fault in any accidents."

    Any car on a public road can be involved in an accident, regardless of how well it's driven, if it's, y'know, hit by another driver. There's nothing in the article that suggests that the self-driving cars were EVER at fault in any accident.

    Not saying it couldn't be the case that the performance is less than advertised, and maybe some unreleased information would suggest this. But god damn, that's a quick jump to conclusion for a headline.

  • (Score: 3, Insightful) by snick on Monday May 11 2015, @08:10PM

    by snick (1408) on Monday May 11 2015, @08:10PM (#181618)

    yes and no.

    Just today, I avoided an accident (one that wouldn't have been my fault) when the truck in the next lane drifted into my lane. Saying that the cars were not at fault in any of the accidents they have been in really doesn't tell us _anything_ about how good they are at accident avoidance.

    • (Score: 0) by Anonymous Coward on Monday May 11 2015, @08:45PM

      by Anonymous Coward on Monday May 11 2015, @08:45PM (#181635)

      Good point. The fact is nobody is a perfect driver; we all make mistakes, and that includes everyone here, myself included. Driving is a dynamic among different drivers attempting not to make mistakes while anticipating the possible mistakes of others and compensating for them. Many times others on the road have made mistakes that would have led to an accident, but because I compensated for them the accident was avoided. Likewise, I've made mistakes that others have compensated for, and I only realized my mistake after the fact. I, like others, also try to learn from my mistakes so that I don't make them again. For instance, some streets and intersections are poorly designed and the first time traversing them can be dangerous. With experience, though, we learn how to avoid mistakes and to better compensate for the possible mistakes of others on those streets and intersections. We may also learn the habits of drivers in various areas and compensate for them as well. This type of intelligence may still be lacking in autonomous vehicles, which may apply stringent rules that work well if all other drivers are themselves autonomous vehicles perfectly applying those same rules, but that don't always work when the other driver is an error-prone human who needs you to compensate for their fallibility.

      A perfect example of this is backing out of a driveway. On side streets, at least here in California, the law states that the person backing out is responsible for an accident. Still, while I'm backing out and there is a car parked at the curb restricting my field of vision, the driver passing by may anticipate my difficulty in seeing around the parked car and compensate, or honk to let me know they're coming. Likewise, I may do the same thing when driving and seeing someone else attempting to back out.

      An advantage that an autonomous vehicle could have is if you can include things like radar to detect other cars and objects. Vision, whether via camera or the eyes, can be limited under certain conditions. Adding radar, and perhaps other parts of the spectrum, could provide advantages under various weather and lighting conditions.

      Another thing to consider is autonomous vehicle maintenance. As the condition of the car changes (e.g., the wheels get misaligned), will the autonomous vehicle compensate? Different roads may require the car to apply different amounts of power to accomplish the same acceleration and braking dynamics. Depending on the condition of the roads, how steep a hill is, how much a road may be slanted in one direction or another, the fact that speed bumps vary, etc., how will an autonomous vehicle compensate under these various conditions and learn to adapt or anticipate safe driving habits? And as the condition of the sensors the computer collects data from changes (e.g., the camera gets fogged up), will the vehicle warn the driver that it's time for maintenance (and will this become intentionally costly)?
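      To illustrate the compensation point, here is a toy closed-loop sketch (purely hypothetical; it is not the control code of any actual test vehicle): a simple feedback controller corrects on the measured error, so it absorbs disturbances like a hill or alignment-induced drag without needing to model their cause.

```python
# Hypothetical sketch: a proportional-integral (PI) speed controller compensating
# for an unknown disturbance such as road grade or misalignment drag. All names,
# gains, and the vehicle model are invented for illustration.

def pi_speed_controller(target, measured, integral, dt, kp=2.0, ki=0.5):
    """Return (throttle_command, updated_integral) for one control step."""
    error = target - measured
    integral += error * dt
    return kp * error + ki * integral, integral

speed, integral = 25.0, 0.0            # start at the 25 m/s target
for step in range(100):
    drag = 0.0 if step < 50 else 1.0   # a "hill" appears halfway through the run
    throttle, integral = pi_speed_controller(25.0, speed, integral, dt=0.1)
    speed += (throttle - drag) * 0.1   # crude vehicle model: speed integrates net force
print(round(speed, 2))                 # close to 25: the loop has absorbed most of the hill
```

      The specific gains don't matter; the point is that feedback on the measured outcome is one way a controller can keep behaving correctly as the car or road slowly changes underneath it.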

      • (Score: 2) by Grishnakh on Monday May 11 2015, @09:46PM

        by Grishnakh (2831) on Monday May 11 2015, @09:46PM (#181660)

        A perfect example of this is backing out of a driveway.

        An advantage that an autonomous vehicle could have is if you can include things like radar to detect other cars and objects.

        You don't need an autonomous vehicle for this; you can go out and buy a car like this right this minute for around $25-30k. A bunch of cars, including the Mazda3 (higher-end packages only), have blind-spot warning systems that use radar in the rear bumper, and because of this they have the additional feature of sounding an alarm if you're backing out and they detect cross traffic.
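        For what it's worth, the warning logic behind that kind of rear cross-traffic alert can be very simple. Here is a hypothetical sketch (names and thresholds are illustrative, not any manufacturer's actual firmware):

```python
# Hypothetical rear cross-traffic alert: warn while reversing if any radar track
# would reach the car's backing path within the warning window. Names and
# thresholds are illustrative, not any manufacturer's actual logic.
from dataclasses import dataclass

@dataclass
class RadarTrack:
    distance_m: float         # lateral distance from our backing path, in metres
    closing_speed_mps: float  # speed toward our path; positive means approaching

def rear_cross_traffic_alert(in_reverse: bool, tracks: list[RadarTrack],
                             warn_window_s: float = 2.5) -> bool:
    """Sound the alarm if any track will cross our path within the warning window."""
    if not in_reverse:
        return False
    for t in tracks:
        if t.closing_speed_mps > 0 and t.distance_m / t.closing_speed_mps <= warn_window_s:
            return True
    return False

# A car 10 m away and closing at 5 m/s reaches our path in 2 s -> alarm.
print(rear_cross_traffic_alert(True, [RadarTrack(10.0, 5.0)]))  # True
```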

  • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @01:14AM

    by Anonymous Coward on Tuesday May 12 2015, @01:14AM (#181737)

    Google and Delphi said their cars were not at fault in any accidents.

    Again, California is a no-fault state. It is impossible for any car or driver to be at fault for any accident.