
posted by LaminatorX on Monday May 11 2015, @06:39PM   Printer-friendly
from the better-mousetrap dept.

According to an article by the AP (via an ad-free site), several of the self-driving cars licensed to drive in California have been involved in accidents.

Most were low-speed accidents, apparently with no injuries.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads. Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop "autonomous driving," a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Neither the companies involved nor the State of California will release details of these accidents, which rankles some critics.

Four accidents involving these 50 cars in eight months may seem a little high. Google's 23 cars have driven 140,000 miles in that time and racked up three accidents all by themselves. That is roughly seven times the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents. However, the NTSB doesn't collect all fender-bender accidents.

The article says that none of the other states that permit self-driving cars have any record of accidents.
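The back-of-the-envelope comparison above can be sketched in a few lines of Python. The figures come from the summary; the 55% under-reporting adjustment is the estimate one commenter cites below, applied here as an assumption:

```python
# Rough accident-rate comparison from the figures in the summary.
google_accidents = 3      # accidents while Google cars drove themselves
google_miles = 140_000    # miles driven by Google's 23 cars
ntsb_rate = 0.3           # NTSB non-injury accidents per 100,000 miles

google_rate = google_accidents / google_miles * 100_000
print(f"Google rate: {google_rate:.2f} per 100,000 miles")   # ~2.14
print(f"Ratio vs NTSB figure: {google_rate / ntsb_rate:.1f}x")  # ~7.1x

# If roughly 55% of fender-benders go unreported (an assumed figure),
# the human baseline more than doubles and the gap shrinks:
adjusted_ntsb = ntsb_rate / (1 - 0.55)
print(f"Adjusted ratio: {google_rate / adjusted_ntsb:.1f}x")  # ~3.2x
```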

 
  • (Score: 3, Insightful) by Snotnose on Monday May 11 2015, @07:36PM

    by Snotnose (1623) on Monday May 11 2015, @07:36PM (#181601)

    Lessee. They get in crashes at a much higher rate than human-driven cars. And they aren't responsible for any of them.

    Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them. What those things might be I don't know, but how much of your daily driving is habit and ritual?

    --
    When the dust settled America realized it was saved by a porn star.
  • (Score: 2) by wonkey_monkey on Monday May 11 2015, @07:38PM

    by wonkey_monkey (279) on Monday May 11 2015, @07:38PM (#181602) Homepage

    Or, or, being experimental vehicles with millions (or even billions) of dollars riding on their roadworthiness, they do a lot more miles than most cars.

    --
    systemd is Roko's Basilisk
    • (Score: 4, Informative) by ikanreed on Monday May 11 2015, @07:48PM

      by ikanreed (3164) Subscriber Badge on Monday May 11 2015, @07:48PM (#181605) Journal

      Oh come on. Read the whole summary. 3 per 140,000 for these cars versus 0.3 per 100,000 typical. That equation don't balance. Even factoring in that 55% of accidents aren't reported, it's still triple the average rate.

However, their cars have been disproportionately involved in city driving, which has a much higher risk ratio (and that's undocumented in the NTSB's figures). The net result of that critical piece of ignorance is that it's virtually impossible to derive meaningful conclusions.

      • (Score: 3, Insightful) by Mr Big in the Pants on Monday May 11 2015, @07:59PM

        by Mr Big in the Pants (4956) on Monday May 11 2015, @07:59PM (#181613)

The fact that they do not is incompetent.
In my country, Statistics NZ counts those figures and includes a lot of other factors.
        I found that out when I checked out the difference on motorcycle accidents and the difference between fatal city and urban crashes.

        And all of this ignores that no one was hurt and thus these crashes are the sort that don't necessarily end up in their stats. Not to mention the cars are in beta and thus one would expect more crashes now than in the future. Not to mention who was at fault in the crashes. etc etc

        This is a load of bullshit all round to be honest.

        And even if at this stage they are higher - who cares? If they end up being much much lower and will change our society for the better long term then everybody just needs to STFU right now!

        • (Score: 0, Troll) by Anonymous Coward on Monday May 11 2015, @09:13PM

          by Anonymous Coward on Monday May 11 2015, @09:13PM (#181649)

          There's another factor to consider besides how safe these cars are: Proprietary software and surveillance. Clearly computers will be more integral to these cars than for other cars, so that provides more opportunity for companies and the government to violate people's privacy. They must have free software, or they should be rejected.

        • (Score: 0, Disagree) by Anonymous Coward on Tuesday May 12 2015, @01:09AM

          by Anonymous Coward on Tuesday May 12 2015, @01:09AM (#181735)

          If they end up being much much lower and will change our society for the better long term then everybody just needs to STFU right now!

          Reasoning that could be used to shut down any discussion isn't reasoning at all.

      • (Score: 5, Insightful) by pe1rxq on Monday May 11 2015, @10:51PM

        by pe1rxq (844) on Monday May 11 2015, @10:51PM (#181688) Homepage

I think it is way too early to compare numbers like this. Accidents are counted in integer increments (you can't have one third of an accident), and 3 is not really a large enough number to do proper statistics with.
By the time the Google cars get past a million miles this number will become interesting. Until then we might just be looking at statistical noise.
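The small-sample point can be made concrete. If accidents arrive as a Poisson process, the relative statistical uncertainty on a count of k events scales as 1/sqrt(k), so three events barely constrain the true rate at all (this rule of thumb is a standard result, not from the comment):

```python
import math

# Poisson counting: relative uncertainty on k observed events ~ 1/sqrt(k).
for k in (3, 30, 300):
    rel = 1 / math.sqrt(k)
    print(f"k={k:>3}: relative uncertainty ~{rel:.0%}")
# k=3 gives ~58% uncertainty, so the observed rate could easily be
# off from the true rate by a factor of two in either direction.
```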

        • (Score: 2) by ikanreed on Tuesday May 12 2015, @01:19PM

          by ikanreed (3164) Subscriber Badge on Tuesday May 12 2015, @01:19PM (#181920) Journal

          Also true, but I didn't want to seem defensive pulling the "You just can't know man" card.

  • (Score: 3, Insightful) by Snotnose on Monday May 11 2015, @08:09PM

    by Snotnose (1623) on Monday May 11 2015, @08:09PM (#181616)

    I just realized something. What if the human driver sees the autonomous vehicle and is so "gee whiz, that's one of them thar self driven thingies! Look ma, no hands! Wow, lookit that spinning thing on the BAM ooops".

    --
    When the dust settled America realized it was saved by a porn star.
    • (Score: 2, Interesting) by kurenai.tsubasa on Monday May 11 2015, @10:41PM

      by kurenai.tsubasa (5227) on Monday May 11 2015, @10:41PM (#181685) Journal

      Seems to me it'd be a combination of the two. There's certainly going to be some amount of rubbernecking, and I'm sure that doesn't help.

      It also occurred to me earlier today that the one thing autonomous vehicles will prevent others around them from doing is employing key #5 of the Smith System [yahoo.com], specifically the part about making eye contact (sorry about Yahoo answers, was the top result on Google). If I'm driving around a robo-car, how can I make sure it sees me? I can't make eye contact with the vision systems in use. I really just have to assume that the software is that good.

      Applying the 5 keys of the Smith System from the point of view of a robo-car:

      1. Aim high in steering: this should be trivial assuming the computer can accurately infer traffic 5 or 10 cars ahead of itself, although I can only assume this is a monster of a machine vision problem.
2. Get the big picture: this is a more difficult machine vision problem. Can these things even read standard street signs yet? What about the inevitable detour one encounters at least once a summer, where half the signs are wrong?
      3. Keep your eyes moving: done and done. Caveat is just figuring out what it's seeing.
      4. Leave yourself an out: like aim high in steering, this should be trivial and will only be held back by machine vision.
      5. Make sure they see you: Ahhh... I'm imagining a paradise of being surrounded by robo-cars that always use their blinkers correctly, always remember to turn their headlights on when the windshield wipers are on, etc, etc. I'm sure facial recognition is probably even at a point today where it'd be able to at least note whether I've looked in its direction. This one gets even better past a certain critical mass of robo-cars on the road all communicating with each other on some giant proximity or mesh network.

      Overall, though, we're left hopelessly speculating without knowing exactly what the two collisions were that happened while the computer was in control.

  • (Score: 4, Insightful) by tftp on Tuesday May 12 2015, @12:23AM

    by tftp (806) on Tuesday May 12 2015, @12:23AM (#181719) Homepage

    Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them.

    Perhaps things like minor violations of the law for sake of safety? People usually don't slam on the brakes if the yellow light comes on when they are a few yards away from the intersection. What does a robot car do? What does the driver behind the robot car expect the robot car to do? Does he even have the physical luxury of an instant reaction? And, of course, if a collision occurs... the robot car is not at fault. It would be the fault of the driver behind who was timing his reaction to a human behavior of the driver ahead.

    There is yet another thing that automated cars won't be able to do at all until they get an AI. That is prediction of actions of others. Humans do it all the time based on subtle hints. For example, if someone drives straight but keeps looking left, chances are he is considering changing lanes. If a car blinks the turn signal for seventeen miles, it's probably not because the driver is about to make a turn any moment now. If a car is moving to the edge of their lane, merging into yours is likely. One can predict many things based on all kinds of observations. It's easy to see the blinking arrows of road repair crews for several blocks ahead - and to change lane ahead of time, before you are facing the cones. (Merging at that time would be more difficult.) Humans see distracted drivers, tailgaters, suspected drunk drivers and stay away from them. A robot car would be unable to read any of that; it would be happily driving behind a vehicle that weaves across two lanes and oversteers to remain more or less within the road; a human would be already on the phone, calling 911 from a safe distance behind - but not the robot car.

    • (Score: 0, Troll) by Ethanol-fueled on Tuesday May 12 2015, @02:14AM

      by Ethanol-fueled (2792) on Tuesday May 12 2015, @02:14AM (#181755) Homepage

      I think it has more to do with the Lexus vehicles -- Lexus vehicles are overwhelmingly driven by Asians. Google engineers are overwhelmingly Asians. Asians are notoriously bad drivers. And in California, to deny a fact more painfully obvious here than anywhere else is too politically incorrect to touch, so of course all of those rice-boys and "exotic princesses" working for Google come out of the woodwork to vindicate their abnormally-high auto insurance rates.

      And accidents are the result. Danger and destruction of California highways for the sake of political correctness rather than evaluation of merit.

    • (Score: 2) by tangomargarine on Tuesday May 12 2015, @03:21AM

      by tangomargarine (667) on Tuesday May 12 2015, @03:21AM (#181777)

      If a car is moving to the edge of their lane, merging into yours is likely.

      Not where I'm driving, at least. The drivers in my city seem to be magnetically attracted to having their tires on the lane markings when they have no intention of switching lanes.

      Needless to say, I prefer to pass people as quickly as possible.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 3, Insightful) by tangomargarine on Tuesday May 12 2015, @03:17AM

    by tangomargarine (667) on Tuesday May 12 2015, @03:17AM (#181774)

    Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them.

    I actually agree with this, but for a very different reason than you probably expect. I live in a big city (top 40 in the U.S.) and if there's one thing I expect of my fellow drivers in the city, it's that they suck at driving.

    I expect people to wander into my lane. I expect people to not signal (a relative in the area joked it's a sign of weakness). I expect people to drive recklessly.

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"