
SoylentNews is people

posted by LaminatorX on Monday May 11 2015, @06:39PM   Printer-friendly
from the better-mousetrap dept.

According to an AP article (via an ad-free site), several of the self-driving cars licensed to drive in California have been involved in accidents.

Most were low-speed accidents, apparently with no injuries.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads. Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop "autonomous driving," a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Neither the companies involved nor the State of California will release details of these accidents, which rankles some critics.

Four accidents involving these 50 cars in 8 months may seem a little high. Google's 23 cars have driven 140,000 miles in that time and racked up 3 accidents all by themselves. That is roughly an order of magnitude higher than the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents. However, the NTSB doesn't record every fender-bender.
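As a rough sanity check on those figures (taking the summary's 3 accidents in 140,000 miles and the NTSB's 0.3 per 100,000 miles at face value), the fleet's rate works out to about seven times the national figure:

```python
# Back-of-the-envelope comparison of the accident rates quoted above.
GOOGLE_ACCIDENTS = 3
GOOGLE_MILES = 140_000
NATIONAL_RATE = 0.3  # NTSB non-injury accidents per 100,000 miles

# Normalize the fleet's record to the same per-100,000-mile basis.
google_rate = GOOGLE_ACCIDENTS / GOOGLE_MILES * 100_000
print(f"Google fleet: {google_rate:.2f} accidents per 100,000 miles")
print(f"Ratio to the NTSB figure: {google_rate / NATIONAL_RATE:.1f}x")
```

So the gap is closer to 7x than a strict 10x, though that is still well above the national figure.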

The article says that none of the other states that permit self-driving cars have any record of accidents.

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by ikanreed on Monday May 11 2015, @06:54PM

    by ikanreed (3164) Subscriber Badge on Monday May 11 2015, @06:54PM (#181582) Journal

Compare this one [medium.com], where they claim 11 accidents have happened, none of them caused by the autonomous cars.

    Being "involved" in an accident doesn't mean causing an accident. The writer of that medium article could be a damned liar. Marketing has been less honest than that in the past, but this reads as someone spoiling for a fight with new technology, and finding an excuse.

    In fact, the Medium story illuminates a probable cause for the variation in statistics. City driving is apparently a lot worse. Who'dve guessed?

  • (Score: 3, Insightful) by Snotnose on Monday May 11 2015, @07:36PM

    by Snotnose (1623) on Monday May 11 2015, @07:36PM (#181601)

    Lessee. They get in crashes at a much higher rate than human-driven cars. And they aren't responsible for any of them.

    Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them. What those things might be I don't know, but how much of your daily driving is habit and ritual?

    --
    When the dust settled America realized it was saved by a porn star.
    • (Score: 2) by wonkey_monkey on Monday May 11 2015, @07:38PM

      by wonkey_monkey (279) on Monday May 11 2015, @07:38PM (#181602) Homepage

      Or, or, being experimental vehicles with millions (or even billions) of dollars riding on their roadworthiness, they do a lot more miles than most cars.

      --
      systemd is Roko's Basilisk
      • (Score: 4, Informative) by ikanreed on Monday May 11 2015, @07:48PM

        by ikanreed (3164) Subscriber Badge on Monday May 11 2015, @07:48PM (#181605) Journal

        Oh come on. Read the whole summary. 3 per 140,000 for these cars versus 0.3 per 100,000 typical. That equation don't balance. Even factoring in that 55% of accidents aren't reported, it's still triple the average rate.
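The "still triple" arithmetic checks out; a minimal sketch, assuming the quoted 55% non-reporting figure applies uniformly (so the national rate should be divided by the 45% that do get reported):

```python
# Inflate the national non-injury rate for unreported fender-benders,
# then compare the fleet's rate against it.
google_rate = 3 / 140_000 * 100_000   # accidents per 100,000 miles
adjusted_national = 0.3 / 0.45        # only 45% of accidents get reported
print(f"Adjusted national rate: {adjusted_national:.2f} per 100,000 miles")
print(f"Fleet rate vs adjusted national: {google_rate / adjusted_national:.1f}x")
```

That comes out to about 3.2x, matching the "triple the average rate" claim.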

However, their cars have been disproportionately involved in city driving, which has a much higher risk ratio (and that's undocumented in the NTSB's figures). The net result of that critical piece of ignorance is that it's virtually impossible to derive meaningful conclusions.

        • (Score: 3, Insightful) by Mr Big in the Pants on Monday May 11 2015, @07:59PM

          by Mr Big in the Pants (4956) on Monday May 11 2015, @07:59PM (#181613)

The fact that they do not is incompetent.
In my country, Statistics NZ counts those figures along with a lot of other factors.
I found that out when I checked the difference in motorcycle accidents and the difference between fatal city and rural crashes.

And all of this ignores that no one was hurt, and thus these crashes are the sort that don't necessarily end up in those stats. Not to mention that the cars are in beta, so one would expect more crashes now than in the future. Not to mention the question of who was at fault in the crashes, etc.

          This is a load of bullshit all round to be honest.

          And even if at this stage they are higher - who cares? If they end up being much much lower and will change our society for the better long term then everybody just needs to STFU right now!

          • (Score: 0, Troll) by Anonymous Coward on Monday May 11 2015, @09:13PM

            by Anonymous Coward on Monday May 11 2015, @09:13PM (#181649)

            There's another factor to consider besides how safe these cars are: Proprietary software and surveillance. Clearly computers will be more integral to these cars than for other cars, so that provides more opportunity for companies and the government to violate people's privacy. They must have free software, or they should be rejected.

          • (Score: 0, Disagree) by Anonymous Coward on Tuesday May 12 2015, @01:09AM

            by Anonymous Coward on Tuesday May 12 2015, @01:09AM (#181735)

            If they end up being much much lower and will change our society for the better long term then everybody just needs to STFU right now!

            Reasoning that could be used to shut down any discussion isn't reasoning at all.

        • (Score: 5, Insightful) by pe1rxq on Monday May 11 2015, @10:51PM

          by pe1rxq (844) on Monday May 11 2015, @10:51PM (#181688) Homepage

I think it is way too early to compare numbers like this. Accidents are counted in integer increments (you can't have one third of an accident), and 3 is not really a large enough number to do proper statistics with.
By the time the Google cars get over a million miles, this number will become interesting. Until then we might just be looking at statistical noise.
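The small-numbers point can be made concrete. As an illustrative sketch (assuming accidents arrive as a Poisson process at the national rate, adjusted upward for the 55% non-reporting estimate mentioned upthread), the chance of seeing 3 or more accidents in 140,000 miles purely by luck is not negligible:

```python
import math

def poisson_p_at_least(k, lam):
    """P(X >= k) for a Poisson(lam) count, via the complement."""
    p_less = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - p_less

# Expected accidents over 140,000 miles at the adjusted national rate.
lam = 0.3 / 100_000 * 140_000 / 0.45
p = poisson_p_at_least(3, lam)
print(f"Expected accidents: {lam:.2f}")
print(f"P(3 or more by chance): {p:.3f}")  # roughly 0.07
```

At p ≈ 0.07 the result is suggestive but wouldn't clear a conventional significance threshold, which supports the "statistical noise" caveat.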

          • (Score: 2) by ikanreed on Tuesday May 12 2015, @01:19PM

            by ikanreed (3164) Subscriber Badge on Tuesday May 12 2015, @01:19PM (#181920) Journal

            Also true, but I didn't want to seem defensive pulling the "You just can't know man" card.

    • (Score: 3, Insightful) by Snotnose on Monday May 11 2015, @08:09PM

      by Snotnose (1623) on Monday May 11 2015, @08:09PM (#181616)

      I just realized something. What if the human driver sees the autonomous vehicle and is so "gee whiz, that's one of them thar self driven thingies! Look ma, no hands! Wow, lookit that spinning thing on the BAM ooops".

      --
      When the dust settled America realized it was saved by a porn star.
      • (Score: 2, Interesting) by kurenai.tsubasa on Monday May 11 2015, @10:41PM

        by kurenai.tsubasa (5227) on Monday May 11 2015, @10:41PM (#181685) Journal

        Seems to me it'd be a combination of the two. There's certainly going to be some amount of rubbernecking, and I'm sure that doesn't help.

        It also occurred to me earlier today that the one thing autonomous vehicles will prevent others around them from doing is employing key #5 of the Smith System [yahoo.com], specifically the part about making eye contact (sorry about Yahoo answers, was the top result on Google). If I'm driving around a robo-car, how can I make sure it sees me? I can't make eye contact with the vision systems in use. I really just have to assume that the software is that good.

        Applying the 5 keys of the Smith System from the point of view of a robo-car:

        1. Aim high in steering: this should be trivial assuming the computer can accurately infer traffic 5 or 10 cars ahead of itself, although I can only assume this is a monster of a machine vision problem.
2. Get the big picture: this is a more difficult machine vision problem. Can these things even read standard street signs yet? What about the inevitable detour one encounters at least once a summer where half the signs are wrong?
        3. Keep your eyes moving: done and done. Caveat is just figuring out what it's seeing.
        4. Leave yourself an out: like aim high in steering, this should be trivial and will only be held back by machine vision.
        5. Make sure they see you: Ahhh... I'm imagining a paradise of being surrounded by robo-cars that always use their blinkers correctly, always remember to turn their headlights on when the windshield wipers are on, etc, etc. I'm sure facial recognition is probably even at a point today where it'd be able to at least note whether I've looked in its direction. This one gets even better past a certain critical mass of robo-cars on the road all communicating with each other on some giant proximity or mesh network.

        Overall, though, we're left hopelessly speculating without knowing exactly what the two collisions were that happened while the computer was in control.

    • (Score: 4, Insightful) by tftp on Tuesday May 12 2015, @12:23AM

      by tftp (806) on Tuesday May 12 2015, @12:23AM (#181719) Homepage

      Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them.

      Perhaps things like minor violations of the law for sake of safety? People usually don't slam on the brakes if the yellow light comes on when they are a few yards away from the intersection. What does a robot car do? What does the driver behind the robot car expect the robot car to do? Does he even have the physical luxury of an instant reaction? And, of course, if a collision occurs... the robot car is not at fault. It would be the fault of the driver behind who was timing his reaction to a human behavior of the driver ahead.

There is yet another thing that automated cars won't be able to do at all until they get an AI: predicting the actions of others. Humans do it all the time based on subtle hints. For example, if someone drives straight but keeps looking left, chances are he is considering changing lanes. If a car blinks the turn signal for seventeen miles, it's probably not because the driver is about to make a turn any moment now. If a car is moving to the edge of their lane, merging into yours is likely. One can predict many things based on all kinds of observations. It's easy to see the blinking arrows of road repair crews for several blocks ahead - and to change lanes ahead of time, before you are facing the cones. (Merging at that time would be more difficult.)

Humans see distracted drivers, tailgaters, and suspected drunk drivers, and stay away from them. A robot car would be unable to read any of that; it would be happily driving behind a vehicle that weaves across two lanes and oversteers to remain more or less within the road. A human would already be on the phone, calling 911 from a safe distance behind - but not the robot car.

      • (Score: 0, Troll) by Ethanol-fueled on Tuesday May 12 2015, @02:14AM

        by Ethanol-fueled (2792) on Tuesday May 12 2015, @02:14AM (#181755) Homepage

        I think it has more to do with the Lexus vehicles -- Lexus vehicles are overwhelmingly driven by Asians. Google engineers are overwhelmingly Asians. Asians are notoriously bad drivers. And in California, to deny a fact more painfully obvious here than anywhere else is too politically incorrect to touch, so of course all of those rice-boys and "exotic princesses" working for Google come out of the woodwork to vindicate their abnormally-high auto insurance rates.

        And accidents are the result. Danger and destruction of California highways for the sake of political correctness rather than evaluation of merit.

      • (Score: 2) by tangomargarine on Tuesday May 12 2015, @03:21AM

        by tangomargarine (667) on Tuesday May 12 2015, @03:21AM (#181777)

        If a car is moving to the edge of their lane, merging into yours is likely.

        Not where I'm driving, at least. The drivers in my city seem to be magnetically attracted to having their tires on the lane markings when they have no intention of switching lanes.

        Needless to say, I prefer to pass people as quickly as possible.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 3, Insightful) by tangomargarine on Tuesday May 12 2015, @03:17AM

      by tangomargarine (667) on Tuesday May 12 2015, @03:17AM (#181774)

      Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them.

      I actually agree with this, but for a very different reason than you probably expect. I live in a big city (top 40 in the U.S.) and if there's one thing I expect of my fellow drivers in the city, it's that they suck at driving.

      I expect people to wander into my lane. I expect people to not signal (a relative in the area joked it's a sign of weakness). I expect people to drive recklessly.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 4, Informative) by Katastic on Monday May 11 2015, @09:16PM

    by Katastic (3340) on Monday May 11 2015, @09:16PM (#181651)

Yeah. This exact article was submitted to Slashdot earlier, and this one - with the same links - reads by contrast like an anti-self-driving-car hit piece.

We didn't leave Slashdot for worse yellow journalism, bias, and exaggeration. Get your shit together, submitter.

    • (Score: 5, Informative) by frojack on Tuesday May 12 2015, @12:34AM

      by frojack (1554) on Tuesday May 12 2015, @12:34AM (#181723) Journal

      I submitted this article a full 6 hours before the slashdot article hit.
      From that point on, I have no control of when it arrives on the page.

As far as I can tell, both articles are slanted about the same as the story upon which they are based. The article was not all good news for self-driving cars.

      I'm not in the business of re-writing the slant of stories I post. If I did that you'd be here taking me to task for that as well.

      --
      No, you are mistaken. I've always had this sig.
    • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @02:37PM

      by Anonymous Coward on Tuesday May 12 2015, @02:37PM (#181946)

      I'm seeing this story reported on all sorts of different web sites, and it's a good exercise in detecting spin.

      One of the common methods used is leaving out crucial details.

    • (Score: 3, Insightful) by Phoenix666 on Tuesday May 12 2015, @04:19PM

      by Phoenix666 (552) on Tuesday May 12 2015, @04:19PM (#181984) Journal

I didn't submit this story, but I submit others. This is a community site: the value is the community, and that community only has as much value as we put into it. So if you or anyone else has better submissions, please submit them.

In practice a submission takes about 10-15 minutes apiece if you make sure that the links are good, that you pull out representative sections from the article for the summary, and that you top that off with a title and a closing conversation-starter line that aren't too salacious. Reading deep background on a subject and sifting out elements that might be spin would increase that prep to at least half an hour per submission. Even if a person had that kind of free time, it would still probably be futile, because no matter how careful you are there will always be someone who comes along and yells "bias!"

      That's why we have discussion, so that subject matter experts or those with inside knowledge can chime in and enlighten the rest of us. That's also more than you'll get from any "legit" media source out there, because there's always an editorial agenda or agenda from the publisher and you never get to talk back or push back on any of them. As in, when's the last time Bill O'Reilly bothered to have a discussion with average viewers who take issue with his "reporting?"

      For what it's worth, I frequently see posts on SN whose language does not agree with my worldview, but them's the breaks. It's a big world with lots of viewpoints in it.

      --
      Washington DC delenda est.
  • (Score: 4, Interesting) by vux984 on Monday May 11 2015, @10:11PM

    by vux984 (5045) on Monday May 11 2015, @10:11PM (#181671)

    Being "involved" in an accident doesn't mean causing an accident.

    You are quite right. Then again being found not legally at fault is not the same as not causing an accident either.

And if the cars are involved in more accidents collectively than the national average, that correlation does suggest something is going on.

    The writer of that medium article could be a damned liar. Marketing has been less honest than that in the past, but this reads as someone spoiling for a fight with new technology, and finding an excuse.

That could be too. I also thought I'd read somewhere that most of the accidents they were involved in happened while under human control. Suggesting that maybe Google doesn't hire people for their driving skills... or perhaps sitting in a car all day not driving it, and then being asked to drive when the car gives up, leads to more mistakes than normal. Or maybe the human drivers only drive when the car can't, because those are the trickiest maneuvers... meaning the car's safety record results from cherry-picking the easy stuff and avoiding the harder stuff. (Something which I suspect is at least partly true.)

    In fact, the Medium story illuminates a probable cause for the variation in statistics. City driving is apparently a lot worse. Who'dve guessed?

Of course it is. But it's not like driverless cars haven't racked up a lot of highway miles of their own too.

    • (Score: 4, Interesting) by vux984 on Monday May 11 2015, @10:18PM

      by vux984 (5045) on Monday May 11 2015, @10:18PM (#181674)

      And if the cars are involved in more accidents collectively than the national average; that correlation does suggest something is going on.

To expand on this... one of the things drivers do, for example, is establish eye contact with other drivers, at intersections etc. When you can do it, that non-verbal communication conveys agreement about right-of-way or who is yielding to whom, increasing the safety of those maneuvers. It's not even remotely always possible (e.g. at night or in heavy rain and snow), but perhaps to the extent that it does happen, it reduces accidents. Perhaps interacting with vehicles that have no driver means they never benefit from that. So while the cars themselves behave legally, they mis-cue other human drivers at a higher rate, leading to slightly higher accident rates.

*1 Do the self-driving cars have significant night-driving records yet, or in extreme weather?
*2 It may also be that the cars not having a driver is itself distracting human drivers, leading to slightly higher accident rates among the humans around them.

      • (Score: 0) by Anonymous Coward on Monday May 11 2015, @10:32PM

        by Anonymous Coward on Monday May 11 2015, @10:32PM (#181679)

        Wouldn't that mean it's ultimately the fault of the humans, and that it would be better if all the cars were driverless and didn't rely on subjective cues?

      • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @07:20AM

        by Anonymous Coward on Tuesday May 12 2015, @07:20AM (#181839)

        The Mercedes concept car is supposed to have a feature (laser projector) to tell pedestrians that it's OK for them to walk in front of it.

        https://www.youtube.com/watch?v=1OSr8mxYED8#t=1m05s [youtube.com]

        So I'm sure they can figure something out.