
posted by LaminatorX on Monday May 11 2015, @06:39PM   Printer-friendly
from the better-mousetrap dept.

According to an article by the AP (via an ad-free site), several of the self-driving cars licensed to drive in California have been involved in accidents.

Most were low-speed accidents, apparently with no injuries.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads. Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop "autonomous driving," a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Neither the companies involved nor the State of California will release details of these accidents, which rankles some critics.

Four accidents involving these 50 cars in 8 months may seem a little high. Google's 23 cars have driven 140,000 miles in that time and racked up 3 accidents all by themselves. That is an order of magnitude higher than the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents. However, the NTSB doesn't collect all fender-bender accidents.
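
For those inclined to check the arithmetic, here is a minimal sketch in Python, assuming only the figures quoted above (3 accidents over 140,000 miles for Google's cars, and the NTSB baseline of 0.3 non-injury accidents per 100,000 miles):

    # Rough comparison of the accident rates quoted in the summary.
    google_accidents = 3
    google_miles = 140_000
    ntsb_rate_per_100k = 0.3  # reported non-injury accidents per 100,000 miles

    google_rate_per_100k = google_accidents / google_miles * 100_000
    print(f"Google: {google_rate_per_100k:.2f} per 100,000 miles")           # ~2.14
    print(f"Ratio:  {google_rate_per_100k / ntsb_rate_per_100k:.1f}x NTSB")  # ~7.1x

So the gap is roughly sevenfold, in the neighborhood of the "order of magnitude" claim, but very sensitive to how much under-reporting one assumes.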

The article says that none of the other states that permit self-driving cars have any record of accidents.

Related Stories

Self-parking Volvo Ploughs Into Journalists - Updated 15 comments

[Update: 05/28 23:38 GMT by mrcoolbp : It appears the driver was testing the auto-braking and/or pedestrian detection packages that the car didn't seem to have. The human driver was in control of the vehicle. This is an at-fault driver, not a 'self-driving' incident. We apologize for any confusion.]

As a group of journalists gathered in the Dominican Republic to report on the self-parking Volvo XC60 (video of the accident available), the group watched as the car reversed itself, then drove into the crowd at speed:

The accident may have happened because owners have to pay for a special feature known as "pedestrian detection functionality," which costs extra. The cars do have auto-braking features as standard, but only for avoiding other cars — if they are to avoid crashing into pedestrians, too, then owners must pay extra.

"It appears as if the car in this video is not equipped with Pedestrian detection," Volvo spokesperson Johan Larsson told Fusion. "This is sold as a separate package."

The pedestrian detection feature, which works using radar behind the grille and a camera in the windshield, costs approximately $3,000. The two men injured in the accident were bruised but otherwise OK.


[Editor's Comment: Original Submission]

Google to Publish Details of All its Self-Driving Accidents 24 comments

In response to reports that its self-driving cars have not been totally free from accidents, Google has created a webpage where it will publish monthly reports detailing all of the accidents its self-driving cars are involved in.

The first report [PDF] includes summaries of all accidents since the start of the Google X project in 2009:

The report for May showed Google cars had been involved in 12 accidents since it first began testing its self-driving cars in 2009, mostly involving rear-ending. Google said one of its vehicles was rear-ended at a stoplight in California on Thursday, bringing the total count to 13 accidents.

"That could mean that the vehicles tend to stop more quickly than human drivers expect," public interest group Consumer Watchdog said. The group called for more details on the accidents, including statements from witnesses and other drivers.

None of these accidents were caused by a fault with the car, Google said.


Original Submission

  • (Score: 4, Interesting) by ikanreed on Monday May 11 2015, @06:54PM

    by ikanreed (3164) Subscriber Badge on Monday May 11 2015, @06:54PM (#181582) Journal

Compare this one [medium.com], where they claim 11 accidents have happened, none of them caused by the autonomous cars.

    Being "involved" in an accident doesn't mean causing an accident. The writer of that medium article could be a damned liar. Marketing has been less honest than that in the past, but this reads as someone spoiling for a fight with new technology, and finding an excuse.

    In fact, the Medium story illuminates a probable cause for the variation in statistics. City driving is apparently a lot worse. Who'dve guessed?

    • (Score: 3, Insightful) by Snotnose on Monday May 11 2015, @07:36PM

      by Snotnose (1623) on Monday May 11 2015, @07:36PM (#181601)

      Lessee. They get in crashes at a much higher rate than human-driven cars. And they aren't responsible for any of them.

      Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them. What those things might be I don't know, but how much of your daily driving is habit and ritual?

      --
      Why shouldn't we judge a book by it's cover? It's got the author, title, and a summary of what the book's about.
      • (Score: 2) by wonkey_monkey on Monday May 11 2015, @07:38PM

        by wonkey_monkey (279) on Monday May 11 2015, @07:38PM (#181602) Homepage

        Or, or, being experimental vehicles with millions (or even billions) of dollars riding on their roadworthiness, they do a lot more miles than most cars.

        --
        systemd is Roko's Basilisk
        • (Score: 4, Informative) by ikanreed on Monday May 11 2015, @07:48PM

          by ikanreed (3164) Subscriber Badge on Monday May 11 2015, @07:48PM (#181605) Journal

          Oh come on. Read the whole summary. 3 per 140,000 for these cars versus 0.3 per 100,000 typical. That equation don't balance. Even factoring in that 55% of accidents aren't reported, it's still triple the average rate.

However, their cars have been disproportionately involved in city driving, which has a much higher risk ratio (and that's not documented in the NTSB's figures). The net result of that critical piece of ignorance is that it's virtually impossible to draw meaningful conclusions.
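
The under-reporting adjustment above works out like this, as a quick sketch (assuming the 55% figure and the summary's numbers; Python):

    # If 55% of fender-benders go unreported, the reported NTSB rate
    # understates the true human rate; scale it up and compare again.
    reported_per_mile = 0.3 / 100_000
    true_per_mile = reported_per_mile / (1 - 0.55)  # ~0.67 per 100,000 miles
    google_per_mile = 3 / 140_000                   # ~2.14 per 100,000 miles
    print(f"Ratio: {google_per_mile / true_per_mile:.1f}x")  # ~3.2x, roughly triple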

          • (Score: 3, Insightful) by Mr Big in the Pants on Monday May 11 2015, @07:59PM

            by Mr Big in the Pants (4956) on Monday May 11 2015, @07:59PM (#181613)

The fact that they don't is incompetence.
In my country, Statistics NZ counts those figures along with a lot of other factors.
I found that out when I looked up the difference in motorcycle accidents and the difference between fatal city and urban crashes.

And all of this ignores that no one was hurt, and thus these crashes are the sort that don't necessarily end up in their stats. Not to mention that the cars are in beta, and thus one would expect more crashes now than in the future. Not to mention who was at fault in the crashes. Etc., etc.

            This is a load of bullshit all round to be honest.

            And even if at this stage they are higher - who cares? If they end up being much much lower and will change our society for the better long term then everybody just needs to STFU right now!

            • (Score: 0, Troll) by Anonymous Coward on Monday May 11 2015, @09:13PM

              by Anonymous Coward on Monday May 11 2015, @09:13PM (#181649)

              There's another factor to consider besides how safe these cars are: Proprietary software and surveillance. Clearly computers will be more integral to these cars than for other cars, so that provides more opportunity for companies and the government to violate people's privacy. They must have free software, or they should be rejected.

            • (Score: 0, Disagree) by Anonymous Coward on Tuesday May 12 2015, @01:09AM

              by Anonymous Coward on Tuesday May 12 2015, @01:09AM (#181735)

              If they end up being much much lower and will change our society for the better long term then everybody just needs to STFU right now!

              Reasoning that could be used to shut down any discussion isn't reasoning at all.

          • (Score: 5, Insightful) by pe1rxq on Monday May 11 2015, @10:51PM

            by pe1rxq (844) on Monday May 11 2015, @10:51PM (#181688) Homepage

I think it is way too early to compare numbers like this. Accidents are counted in integer increments (you can't have one third of an accident), and 3 is not really a large enough number to do proper statistics with.
By the time the Google cars have logged over a million miles, this number will become interesting. Until then we might just be looking at statistical noise.
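
To put a number on how noisy a count of 3 is, here is a small sketch computing an exact Poisson 95% confidence interval (the Garwood interval; assumes scipy is available):

    # Exact Poisson 95% CI for a count of 3, via the chi-squared relationship.
    from scipy.stats import chi2

    k = 3                                  # observed accidents
    lo = chi2.ppf(0.025, 2 * k) / 2        # ~0.62 events
    hi = chi2.ppf(0.975, 2 * (k + 1)) / 2  # ~8.77 events
    miles = 140_000
    print(f"{lo / miles * 1e5:.2f} to {hi / miles * 1e5:.2f} per 100,000 miles")
    # ~0.44 to ~6.26: far too wide to settle the comparison either way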

            • (Score: 2) by ikanreed on Tuesday May 12 2015, @01:19PM

              by ikanreed (3164) Subscriber Badge on Tuesday May 12 2015, @01:19PM (#181920) Journal

              Also true, but I didn't want to seem defensive pulling the "You just can't know man" card.

      • (Score: 3, Insightful) by Snotnose on Monday May 11 2015, @08:09PM

        by Snotnose (1623) on Monday May 11 2015, @08:09PM (#181616)

        I just realized something. What if the human driver sees the autonomous vehicle and is so "gee whiz, that's one of them thar self driven thingies! Look ma, no hands! Wow, lookit that spinning thing on the BAM ooops".

        --
        Why shouldn't we judge a book by it's cover? It's got the author, title, and a summary of what the book's about.
        • (Score: 2, Interesting) by kurenai.tsubasa on Monday May 11 2015, @10:41PM

          by kurenai.tsubasa (5227) on Monday May 11 2015, @10:41PM (#181685) Journal

          Seems to me it'd be a combination of the two. There's certainly going to be some amount of rubbernecking, and I'm sure that doesn't help.

          It also occurred to me earlier today that the one thing autonomous vehicles will prevent others around them from doing is employing key #5 of the Smith System [yahoo.com], specifically the part about making eye contact (sorry about Yahoo answers, was the top result on Google). If I'm driving around a robo-car, how can I make sure it sees me? I can't make eye contact with the vision systems in use. I really just have to assume that the software is that good.

          Applying the 5 keys of the Smith System from the point of view of a robo-car:

          1. Aim high in steering: this should be trivial assuming the computer can accurately infer traffic 5 or 10 cars ahead of itself, although I can only assume this is a monster of a machine vision problem.
2. Get the big picture: this is a more difficult machine vision problem. Can these things even read standard street signs yet? What about the inevitable detour one encounters at least once a summer where half the signs are wrong?
          3. Keep your eyes moving: done and done. Caveat is just figuring out what it's seeing.
          4. Leave yourself an out: like aim high in steering, this should be trivial and will only be held back by machine vision.
          5. Make sure they see you: Ahhh... I'm imagining a paradise of being surrounded by robo-cars that always use their blinkers correctly, always remember to turn their headlights on when the windshield wipers are on, etc, etc. I'm sure facial recognition is probably even at a point today where it'd be able to at least note whether I've looked in its direction. This one gets even better past a certain critical mass of robo-cars on the road all communicating with each other on some giant proximity or mesh network.

          Overall, though, we're left hopelessly speculating without knowing exactly what the two collisions were that happened while the computer was in control.

      • (Score: 4, Insightful) by tftp on Tuesday May 12 2015, @12:23AM

        by tftp (806) on Tuesday May 12 2015, @12:23AM (#181719) Homepage

        Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them.

Perhaps things like minor violations of the law for the sake of safety? People usually don't slam on the brakes if the yellow light comes on when they are a few yards away from the intersection. What does a robot car do? What does the driver behind the robot car expect the robot car to do? Does he even have the physical luxury of an instant reaction? And, of course, if a collision occurs... the robot car is not at fault. It would be the fault of the driver behind, who was timing his reaction to the human behavior of the driver ahead.

There is yet another thing that automated cars won't be able to do at all until they get an AI: predicting the actions of others. Humans do it all the time based on subtle hints. For example, if someone drives straight but keeps looking left, chances are he is considering changing lanes. If a car blinks the turn signal for seventeen miles, it's probably not because the driver is about to make a turn any moment now. If a car is moving to the edge of its lane, merging into yours is likely. One can predict many things based on all kinds of observations. It's easy to see the blinking arrows of road repair crews for several blocks ahead - and to change lanes ahead of time, before you are facing the cones. (Merging at that time would be more difficult.) Humans see distracted drivers, tailgaters, suspected drunk drivers, and stay away from them. A robot car would be unable to read any of that; it would be happily driving behind a vehicle that weaves across two lanes and oversteers to remain more or less within the road. A human would already be on the phone, calling 911 from a safe distance behind - but not the robot car.

        • (Score: 0, Troll) by Ethanol-fueled on Tuesday May 12 2015, @02:14AM

          by Ethanol-fueled (2792) on Tuesday May 12 2015, @02:14AM (#181755) Homepage

          I think it has more to do with the Lexus vehicles -- Lexus vehicles are overwhelmingly driven by Asians. Google engineers are overwhelmingly Asians. Asians are notoriously bad drivers. And in California, to deny a fact more painfully obvious here than anywhere else is too politically incorrect to touch, so of course all of those rice-boys and "exotic princesses" working for Google come out of the woodwork to vindicate their abnormally-high auto insurance rates.

          And accidents are the result. Danger and destruction of California highways for the sake of political correctness rather than evaluation of merit.

        • (Score: 2) by tangomargarine on Tuesday May 12 2015, @03:21AM

          by tangomargarine (667) on Tuesday May 12 2015, @03:21AM (#181777)

          If a car is moving to the edge of their lane, merging into yours is likely.

          Not where I'm driving, at least. The drivers in my city seem to be magnetically attracted to having their tires on the lane markings when they have no intention of switching lanes.

          Needless to say, I prefer to pass people as quickly as possible.

          --
          "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 3, Insightful) by tangomargarine on Tuesday May 12 2015, @03:17AM

        by tangomargarine (667) on Tuesday May 12 2015, @03:17AM (#181774)

        Sounds like autonomous cars do things human drivers don't expect. Things that make human drivers run into them.

        I actually agree with this, but for a very different reason than you probably expect. I live in a big city (top 40 in the U.S.) and if there's one thing I expect of my fellow drivers in the city, it's that they suck at driving.

        I expect people to wander into my lane. I expect people to not signal (a relative in the area joked it's a sign of weakness). I expect people to drive recklessly.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 4, Informative) by Katastic on Monday May 11 2015, @09:16PM

      by Katastic (3340) on Monday May 11 2015, @09:16PM (#181651)

Yeah. This exact article was submitted to Slashdot earlier, and this one by contrast--with the same links--is written like an anti-self-driving-car hit piece.

We didn't leave Slashdot to get worse yellow journalism, bias, and exaggeration. Get your shit together, submitter.

      • (Score: 5, Informative) by frojack on Tuesday May 12 2015, @12:34AM

        by frojack (1554) on Tuesday May 12 2015, @12:34AM (#181723) Journal

I submitted this article a full 6 hours before the Slashdot article hit.
From that point on, I have no control over when it arrives on the page.

As far as I can tell, both articles are slanted about the same as the story upon which they are based. The article was not all good news for self-driving cars.

        I'm not in the business of re-writing the slant of stories I post. If I did that you'd be here taking me to task for that as well.

        --
        No, you are mistaken. I've always had this sig.
      • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @02:37PM

        by Anonymous Coward on Tuesday May 12 2015, @02:37PM (#181946)

        I'm seeing this story reported on all sorts of different web sites, and it's a good exercise in detecting spin.

        One of the common methods used is leaving out crucial details.

      • (Score: 3, Insightful) by Phoenix666 on Tuesday May 12 2015, @04:19PM

        by Phoenix666 (552) on Tuesday May 12 2015, @04:19PM (#181984) Journal

        I didn't submit this story, but I submit others. This is a community site, as in the value is the community and that community only has as much value as we put into it. So if you or anyone else has better submissions, please submit them.

In practice, sending in a submission takes about 10-15 minutes apiece if you make sure that the links are good, that you pull out representative sections from the article for the summary, and that you top it off with a title and a conversation-starter line at the end that aren't too salacious. Reading deep background on a subject and sifting out elements that might be spin would increase that submission prep to at least half an hour apiece. Even if a person had that kind of free time, it would still probably be futile, because no matter how careful you are there will always be someone who comes along and yells "bias!"

        That's why we have discussion, so that subject matter experts or those with inside knowledge can chime in and enlighten the rest of us. That's also more than you'll get from any "legit" media source out there, because there's always an editorial agenda or agenda from the publisher and you never get to talk back or push back on any of them. As in, when's the last time Bill O'Reilly bothered to have a discussion with average viewers who take issue with his "reporting?"

        For what it's worth, I frequently see posts on SN whose language does not agree with my worldview, but them's the breaks. It's a big world with lots of viewpoints in it.

        --
        Washington DC delenda est.
    • (Score: 4, Interesting) by vux984 on Monday May 11 2015, @10:11PM

      by vux984 (5045) on Monday May 11 2015, @10:11PM (#181671)

      Being "involved" in an accident doesn't mean causing an accident.

      You are quite right. Then again being found not legally at fault is not the same as not causing an accident either.

And if the cars are involved in more accidents collectively than the national average, that correlation does suggest something is going on.

The writer of that Medium article could be a damned liar; marketing has been less honest than that in the past. But this reads as someone spoiling for a fight with new technology, and finding an excuse.

That could be too. I also thought I'd read somewhere that most of the accidents they were involved in happened while under human control. Suggesting that maybe Google doesn't hire people for their driving skills... or perhaps sitting in a car all day not driving it, and then being asked to drive it when the car gives up, leads to more mistakes than normal. Or maybe the human drivers are only driving when the car can't, because those are the trickiest maneuvers... meaning the car's safety record results from it cherry-picking the easy stuff and avoiding the harder stuff. (Something which I suspect is at least partly true.)

      In fact, the Medium story illuminates a probable cause for the variation in statistics. City driving is apparently a lot worse. Who'dve guessed?

Of course it is. But it's not like driverless cars haven't racked up a lot of highway miles of their own too.

      • (Score: 4, Interesting) by vux984 on Monday May 11 2015, @10:18PM

        by vux984 (5045) on Monday May 11 2015, @10:18PM (#181674)

And if the cars are involved in more accidents collectively than the national average, that correlation does suggest something is going on.

To expand on this... one of the things drivers do, for example, is establish eye contact with other drivers, at intersections etc. When you can do it, that non-verbal communication conveys agreement about right-of-way or who is yielding to whom, increasing the safety of those maneuvers. It's not even remotely always possible (e.g. at night or in heavy rain and snow), but perhaps, to the extent that it does happen, it reduces accidents. Perhaps interacting with vehicles that have no driver means they never benefit from that. So while the cars themselves behave legally, they mis-cue other human drivers at a higher rate, leading to slightly higher accident rates.

*1 Do the self-driving cars have significant night-driving records yet, or records in extreme weather?
*2 It may also be that the cars not having a driver is itself distracting human drivers, leading to slightly higher accident rates among the humans around them.

        • (Score: 0) by Anonymous Coward on Monday May 11 2015, @10:32PM

          by Anonymous Coward on Monday May 11 2015, @10:32PM (#181679)

          Wouldn't that mean it's ultimately the fault of the humans, and that it would be better if all the cars were driverless and didn't rely on subjective cues?

        • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @07:20AM

          by Anonymous Coward on Tuesday May 12 2015, @07:20AM (#181839)

          The Mercedes concept car is supposed to have a feature (laser projector) to tell pedestrians that it's OK for them to walk in front of it.

          https://www.youtube.com/watch?v=1OSr8mxYED8#t=1m05s [youtube.com]

          So I'm sure they can figure something out.

  • (Score: 3, Insightful) by gnuman on Monday May 11 2015, @06:59PM

    by gnuman (5013) on Monday May 11 2015, @06:59PM (#181584)

However, the NTSB doesn't collect all fender-bender accidents.

Unlike for self-driving cars, where every accident gets counted, including ones where the software wasn't actually in control.

Anyway, at least minor issues are to be expected during initial experimentation. I still prefer self-driving car safety over any texting "driver" I see out there. Self-driving cars can only improve.

    • (Score: 4, Insightful) by ikanreed on Monday May 11 2015, @07:05PM

      by ikanreed (3164) Subscriber Badge on Monday May 11 2015, @07:05PM (#181587) Journal

Unreported fender-benders are estimated to be approximately 55% of vehicular accidents, so that isn't exactly vindication, but I largely agree.

Maybe a skilled, attentive human driver could have avoided the situations that led to those accidents, but skilled, attentive drivers aren't exactly the norm.

      • (Score: 3, Interesting) by frojack on Tuesday May 12 2015, @12:19AM

        by frojack (1554) on Tuesday May 12 2015, @12:19AM (#181717) Journal

If you take the federal number of .3 and double it to account for only 55% of accidents being reported, you still end up with .6 per 100,000 miles (about .84 per 140,000 miles) driven strictly by humans.

That mental exercise serves two purposes:
1) It points out that the 23 Google cars experience minor accidents roughly three and a half times more often than human-driven cars.
2) It points out that sufficiently skilled and attentive drivers ARE pretty much the norm.

That we have all seen the occasional texter driving, and yet not one in 10,000 of us** has actually seen a texter crash, suggests that texting while driving is more likely just people READING while driving rather than actually texting. Because if it's actually texting, you would have to come to the conclusion that texting while driving is a LOT less deadly than we are led to believe.

        ** numbers pulled straight from my ass. Prove me wrong!
        Hell, I've seen cops texting while driving!!

        --
        No, you are mistaken. I've always had this sig.
    • (Score: 2) by Rivenaleem on Tuesday May 12 2015, @08:25AM

      by Rivenaleem (3400) on Tuesday May 12 2015, @08:25AM (#181857)

They also only looked at data from California. They state that there have been no other accidents in other states that permit self-driving cars, so how many cars in total are on the roads? They say that the number of incidents is an order of magnitude higher than overall incidents, but they only have 48 of all the cars on the road in the sample group, because they chose to narrow the selection to California only.

Where's the rest of the data?

  • (Score: 5, Insightful) by SecurityGuy on Monday May 11 2015, @07:16PM

    by SecurityGuy (1453) on Monday May 11 2015, @07:16PM (#181592)

    We should be happy with self driving cars as soon as they're demonstrably safer than we are. Human drivers are nowhere near accident free, and a small subset of them are downright unsafe (speeding, drunk, texting, etc). If you're going to be speeding while drunk and texting, I'd much rather you be in one of Google's cars and take my chances with them.

    • (Score: 4, Insightful) by snick on Monday May 11 2015, @07:49PM

      by snick (1408) on Monday May 11 2015, @07:49PM (#181606)

We need to get over the concept of the "standby driver" first. Having a person who is normally disengaged from the driving task take over when the situation becomes too complex for the computer is a recipe for disaster. If you aren't actively driving, you will not be engaged with the environment around the vehicle, and the amount of time between the computer deciding to bail and the bad thing that scared the computer actually happening will usually be about enough for the "driver" to look up. No time to assess the situation, and none to react.

      Autonomous cars aren't nearly as dangerous as semi-autonomous cars.

      • (Score: 2, Interesting) by Anonymous Coward on Monday May 11 2015, @08:43PM

        by Anonymous Coward on Monday May 11 2015, @08:43PM (#181633)

        I agree. A number of years ago I was driving on a major interstate at the start of a holiday weekend. I saw a vehicle a few hundred yards in front of me that entered the median. My initial thought was it was an unmarked cop car turning around to go after someone heading the opposite direction. About a second later I realized he was going much too fast to be turning around, and I mumbled something to the effect of "What the hell..." That was just enough time for my sister to look up and see the car plow into another car in the opposing lane. Traffic typically moves about 75-80 mph in that area, and the driver that crossed the median was killed. By the time I realized what I had seen the accident was a couple hundred yards behind me.

Pretend this was an automated car. Five hours into a trip, an alarm goes off. The time my sister had to look up and see the accident would likely be all the time you have. So you have a full second or two to hear the alarm, stop whatever you are doing, check your surroundings, determine the threat(s), and brake and/or take evasive action. Under those conditions I would trust an automated car to make better and quicker decisions than me.

        • (Score: 3, Disagree) by Reziac on Tuesday May 12 2015, @03:33AM

          by Reziac (2489) on Tuesday May 12 2015, @03:33AM (#181781) Homepage

          I wouldn't. I've found that I react automatically and appropriately to a hazard situation before I'm consciously aware of doing so -- often before the hazard fully manifests. Thus I've avoided a number of accidents. Judging by what I've seen around me in many years driving in Los Angeles and across the western U.S., most people do the same. I'd say this anticipatory defensive driving is a standard feature; as Frojack points out, the stats inform us that most people *are* good drivers.

          --
          And there is no Alkibiades to come back and save us from ourselves.
          • (Score: 3, Insightful) by snick on Tuesday May 12 2015, @01:01PM

            by snick (1408) on Tuesday May 12 2015, @01:01PM (#181913)

            Sure, folks react automatically to hazards. But that is because they are already actively driving, and are immersed in the vehicle's situation when the hazard presents itself. Even if you don't consciously think about it, you are aware of what is in front of, behind, and on both sides of the car at all times, and have a general idea of all the relative velocities. (or at least you ought to)

A standby driver isn't going to be immersed in the vehicle's situation. S/he will be having a conversation with the backseat passenger, or Facebooking on his/her phone, when the hazard presents. There is no way that you can put someone in the driver's seat, give them nothing to do, and expect them to be alert and engaged.

            Handing off driving responsibilities from the computer to the "driver" in a crisis is not like having a driver handle a crisis. It is like suddenly handing the controls to someone in the passenger seat who was taking a nap.

            • (Score: 2) by Reziac on Tuesday May 12 2015, @01:36PM

              by Reziac (2489) on Tuesday May 12 2015, @01:36PM (#181928) Homepage

              Exactly. Driving is not a part-time job; it's a state of awareness one gets into and stays in for the duration, which is why most of us do it well. One driver in a thousand might react appropriately when 'awakened' from their Facebook nap; the rest are more likely to panic and react badly or far too late. [Side note: I found that playing DOOM, where you can get ambushed from any angle at any moment, made me a better driver, more aware in all directions.]

              I think self-driving cars are likely to be all or nothing; if it's "some" there'll be too many vehicles acting in ways live drivers don't expect and can't properly anticipate, as the self-drivers behave in situationally-wrong ways (failing to anticipate from those tiny cues that live drivers catch). I expect our eventual adjustment will be to stay the hell away from any self-driving car we see on the road, or to confine their use to HOV-type lanes.

              Drivers in America seem to be a different beast from the rest of the world, tho. I watch a lot of those Russian dashcam vids, and here in America we just never see that level of uncoordinated, oblivious, and "Rules of the road? Whuzzat??"

              --
              And there is no Alkibiades to come back and save us from ourselves.
            • (Score: 2) by acid andy on Tuesday May 12 2015, @05:32PM

              by acid andy (1683) on Tuesday May 12 2015, @05:32PM (#182013) Homepage Journal

              You're obviously not a back seat driver. It's easy when someone else is driving to find yourself still going through the motions of checking it's clear at junctions, watching for hazards etc. I'd imagine if you had a suspicion that your self-driving car wasn't 100% accident proof that you'd be doing the same in it.

              --
              If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
    • (Score: 2) by gidds on Monday May 11 2015, @08:14PM

      by gidds (589) on Monday May 11 2015, @08:14PM (#181620)

      I agree.  But I fear it won't be as easy as that.

      After all, everyone thinks they're an above-average driver.  (I do myself!)

      So you have to convince people that autonomous cars are not just safer than average, but safer than them...

      --
      [sig redacted]
      • (Score: 2) by isostatic on Monday May 11 2015, @08:47PM

        by isostatic (365) on Monday May 11 2015, @08:47PM (#181638) Journal

        First thing you do is convince them they are safer than your average taxi driver. That won't be hard.

  • (Score: 3, Insightful) by sjames on Monday May 11 2015, @07:42PM

    by sjames (2882) on Monday May 11 2015, @07:42PM (#181604) Journal

    Even a pedestrian can be involved in a vehicular accident. From the 4 accidents in the report, we can cut that down to 2 immediately since the autonomous system wasn't driving. That just leaves 2 where the autonomous system was driving but the autonomous vehicle wasn't ruled at fault in either of those. Even the best driver can be involved in an accident if the other driver is an idiot.

    • (Score: 2) by snick on Monday May 11 2015, @07:55PM

      by snick (1408) on Monday May 11 2015, @07:55PM (#181611)

      From the 4 accidents in the report, we can cut that down to 2 immediately since the autonomous system wasn't driving.

      Is there any data on how long the hu-man had been in control before the 2 accidents you want to dismiss? If the autonomous system brought the car into a situation it couldn't deal with, so it gave control to the hu-man (who couldn't deal with it either) ... that should be counted as a failure of the autonomous system.

      • (Score: 2) by sjames on Monday May 11 2015, @08:14PM

        by sjames (2882) on Monday May 11 2015, @08:14PM (#181619) Journal

        TFA also reports that the collisions happened at under 10 MPH. It seems unlikely that there was a sudden handoff since about the only thing you can do in such a case is honk and hit the brakes.

        • (Score: 2) by snick on Monday May 11 2015, @08:31PM

          by snick (1408) on Monday May 11 2015, @08:31PM (#181629)

          TFA is light on details, so all of this is guessing. Could there have been a handoff @45 mph and the hu-man jamming on the brakes got it down to 10 mph before impact? Pure speculation.

          My guess is that it is policy that the hu-man take the car out of the parking lot in manual mode, and Google's parking lots are full of maniacs.

          • (Score: 2) by sjames on Monday May 11 2015, @08:44PM

            by sjames (2882) on Monday May 11 2015, @08:44PM (#181634) Journal

            TFA is light on details, so all of this is guessing. Could there have been a handoff @45 mph and the hu-man jamming on the brakes got it down to 10 mph before impact? Pure speculation.

            That seems like a fairly desperate effort to contort the story.

            My guess is that it is policy that the hu-man take the car out of the parking lot in manual mode, and Google's parking lots are full of maniacs.

            That seems likely closer to the facts.

    • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @01:11AM

      by Anonymous Coward on Tuesday May 12 2015, @01:11AM (#181736)

      but the autonomous vehicle wasn't ruled at fault in either of those

      California is a no-fault state. No one is ever at fault. Clever that.

  • (Score: 2) by wonkey_monkey on Monday May 11 2015, @07:53PM

    by wonkey_monkey (279) on Monday May 11 2015, @07:53PM (#181609) Homepage

    Self Driving Cars: Not so Accident Free after All

First of all, who said they were ever going to be accident-free while humans are still out there driving? Secondly, you could at least acknowledge that it's even possible that the accidents weren't the autonomous car's fault (which does in fact appear to be the case in all four accidents).

That is an order of magnitude higher than the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents.

    What's the rate per 100,000 for all accidents?

    --
    systemd is Roko's Basilisk
    • (Score: 2) by Rivenaleem on Tuesday May 12 2015, @08:29AM

      by Rivenaleem (3400) on Tuesday May 12 2015, @08:29AM (#181860)

I said it elsewhere, but they have also chosen to narrow their data selection to the one state where there were accidents, and not include the cars from the other states where testing is happening but there have been no accidents at all. What's the total number of autonomous cars on the road? How many total miles have they racked up? And are we still only at 4 accidents for all of them, none involving any injury (and, as you say, only 2 under autonomous control)?

  • (Score: 4, Insightful) by MrGuy on Monday May 11 2015, @07:56PM

    by MrGuy (1007) on Monday May 11 2015, @07:56PM (#181612)

    Self Driving Cars: Not so Accident Free after All

    Seriously? We let a smarmy "gotcha" headline like this through? When even the text of TFS states "Google and Delphi said their cars were not at fault in any accidents."

    Any car on a public road can be involved in an accident, regardless of how well it's driven, if it's, y'know, hit by another driver. There's nothing in the article that suggests that the self-driving cars were EVER at fault in any accident.

    Not saying it couldn't be the case that the performance is less than advertised, and maybe some unreleased information would suggest this. But god damn, that's a quick jump to conclusion for a headline.

    • (Score: 3, Insightful) by snick on Monday May 11 2015, @08:10PM

      by snick (1408) on Monday May 11 2015, @08:10PM (#181618)

      yes and no.

Just today, I avoided being in an accident (one that wouldn't have been my fault) when the truck in the next lane drifted into my lane. Saying that the cars were not at fault in any of the accidents they have been in really doesn't tell us _anything_ about how good they are at accident avoidance.

      • (Score: 0) by Anonymous Coward on Monday May 11 2015, @08:45PM

        by Anonymous Coward on Monday May 11 2015, @08:45PM (#181635)

Good point. The fact is nobody is a perfect driver; we all make mistakes, and that includes everyone here, myself included. Driving is a dynamic among different drivers attempting not to make mistakes while anticipating the possible mistakes of others and compensating. So many times others on the road have made mistakes that would have led to an accident, but because I compensated for them they were avoided. Likewise, I've made mistakes that others have compensated for, and I realized my mistake after the fact. I, like others, also attempt to learn from my mistakes so that I don't make them again. For instance, some streets and intersections are poorly designed, and the first time traversing them can be dangerous. However, with experience we learn how to avoid mistakes and better compensate for the possible mistakes of others on those streets and intersections. We may also learn the habits of drivers within various areas and compensate for them as well. This type of intelligence may still be lacking in autonomous vehicles, which may apply stringent rules that work well if you assume all other drivers are themselves autonomous vehicles perfectly applying those same rules, but that don't always work when the other driver may be a fallible human that needs you to compensate for their mistakes.

A perfect example of this is backing out of a driveway. When driving the side streets, at least here in California, the law states that the person backing out is responsible for an accident. Still, while I'm backing out and there is a car parked at the curb restricting my field of vision, the driver passing by may anticipate my difficulty in seeing around the parked car and compensate, or honk to let me know they're coming. Likewise, I may do the same thing when driving and seeing someone else attempting to back out.

An advantage that an autonomous vehicle could have is the inclusion of things like radar to detect other cars and objects. Vision, via camera or the eyes, can be limited under certain conditions. Adding radar and perhaps other radiation spectra could provide advantages under various (weather and lighting) conditions.

Another thing to consider is autonomous vehicle maintenance. As the condition of the car changes (i.e. the wheels get misaligned), will the autonomous vehicle compensate? Different roads may require the car to apply different amounts of power to accomplish the same acceleration and braking dynamics. Depending on the condition of the roads, how steep a hill is, how much a road may be slanted in one direction or another, the fact that speed bumps vary, etc., how will an autonomous vehicle compensate under these various conditions and learn to adapt or anticipate safe driving habits? And as the condition of the sensors the computer collects data from changes (e.g. the camera gets fogged up), will the vehicle warn the driver that it's maintenance time (and will this become intentionally costly)?

        • (Score: 2) by Grishnakh on Monday May 11 2015, @09:46PM

          by Grishnakh (2831) on Monday May 11 2015, @09:46PM (#181660)

          A perfect example of this is backing out of a driveway.

An advantage that an autonomous vehicle could have is the inclusion of things like radar to detect other cars and objects.

You don't need an autonomous vehicle for this; you can go out and buy a car like this right this minute for around $25-30k. A bunch of cars, including the Mazda3 (higher-end packages only), have blind-spot detection warning systems which use radar in the back bumper, and because of this have the additional feature of sounding an alarm if you're backing out and it detects any cross traffic.

    • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @01:14AM

      by Anonymous Coward on Tuesday May 12 2015, @01:14AM (#181737)

      Google and Delphi said their cars were not at fault in any accidents.

      Again, California is a no-fault state. It is impossible for any car or driver to be at fault for any accident.

  • (Score: 2) by gidds on Monday May 11 2015, @08:18PM

    by gidds (589) on Monday May 11 2015, @08:18PM (#181622)

...but I can't help wondering who might stand to lose from autonomous vehicles, and whether they'd be prepared to deliberately cause (er, just happen to get into) accidents with them...

    --
    [sig redacted]
    • (Score: 0) by Anonymous Coward on Tuesday May 12 2015, @02:51PM

      by Anonymous Coward on Tuesday May 12 2015, @02:51PM (#181949)

Well, the auto industry might not want Google to monopolize the self-driving tech, but on the other hand, by licensing such tech they would no longer have any problems with planned obsolescence, because the auto would (shudder) become like the PC or smartphone: switch to a newer model every X years, or else.

People might not want self-driving cars, but the will of the people counts only when people are independent, and technically, in multi-national economic systems, they are not.

  • (Score: 3, Insightful) by maxwell demon on Monday May 11 2015, @08:19PM

    by maxwell demon (1608) on Monday May 11 2015, @08:19PM (#181624) Journal

    From the summary:

    Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving
    So in two of the four accidents the self-driving feature was not involved; the car was driven as a normal car that happened to carry some extra electronics around.

    Assuming that the cars were travelling a larger distance in self-driving mode than in driver-controlled mode (after all, their purpose was to test the self-driving feature), this means that the automatic driver was better than the human one. Although the difference should not be statistically significant.

    --
    The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2) by maxwell demon on Monday May 11 2015, @08:25PM

      by maxwell demon (1608) on Monday May 11 2015, @08:25PM (#181627) Journal

      I could have sworn I closed the blockquote tags … the second paragraph should have been outside the quote.
      Reminder to self: Don't forget to preview!

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 2) by maxwell demon on Monday May 11 2015, @08:28PM

        by maxwell demon (1608) on Monday May 11 2015, @08:28PM (#181628) Journal

        Other reminder to self: And check more carefully the content. Of course the quote already ends inside the first paragraph … only the first line is actually quoted (though on small screens, it might be displayed on more than one line).

        --
        The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 2) by takyon on Monday May 11 2015, @08:48PM

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday May 11 2015, @08:48PM (#181639) Journal

    Nobody seems to have hit on this detail yet:

    The person familiar with the accident reports said the cars were in self-driving mode in two of the four accidents, all of which involved speeds of less than 10 mph.

    The details are confidential, but it could have involved as little as someone (human) not looking while backing out of a driveway or parking space.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by FatPhil on Wednesday May 13 2015, @08:51AM

      by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Wednesday May 13 2015, @08:51AM (#182293) Homepage
      For a different slant:
      http://www.independent.co.uk/life-style/gadgets-and-tech/news/googles-selfdriving-cars-have-crashed-11-times-in-recent-years--because-humans-keep-driving-into-them-company-claims-10243398.html
      """
      The self-driving cars had not once been “the cause of the accident”, wrote Chris Urmson, the director of Google’s self-driving car programme. Urmson said that most of the crashes had been people driving into the back of the cars, “mainly at traffic lights but also on the freeway”, and some of the other crashes are thought to have happened while the car was being driven by a human.
      """
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 3, Funny) by MichaelDavidCrawford on Monday May 11 2015, @10:11PM

    ... when suicide bombers don't need to commit suicide anymore.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 3, Touché) by takyon on Monday May 11 2015, @10:31PM

      by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday May 11 2015, @10:31PM (#181678) Journal

      No, automated cars will just free up car bombers to commit suicide bombings in new ways, like on foot.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by bob_super on Tuesday May 12 2015, @12:16AM

      by bob_super (1357) on Tuesday May 12 2015, @12:16AM (#181715)

The people of Afghanistan would like to talk to you about lives saved when a US-made autonomous vehicle goes around doing the killing.

I can totally envision the side effects of stolen credit cards + Uber + autonomous vehicles: one bad guy could attack many areas at once without even having to ride on top of his own microwave bomb.

    • (Score: 1, Insightful) by Anonymous Coward on Tuesday May 12 2015, @01:19AM

      by Anonymous Coward on Tuesday May 12 2015, @01:19AM (#181739)
      Well, we have cruise missiles, but 9/11 was an attempt to do the same thing manually.
      • (Score: 2) by MichaelDavidCrawford on Tuesday May 12 2015, @03:02AM

it was the first port of call of a Russian naval vessel since 1943. I'm not sure, but I expect the earlier one was a Soviet ship.

        It was equipped with roughly ten cruise missile silos. The San Francisco Chronicle mentioned that the cruise missiles could take out another ship at a distance of 3,000 miles.

My father was a US naval antiaircraft missile fire control officer. My degree is in physics; while I've never worked with nuclear weapons, I've studied them quite a bit.

        The whole time I was waiting in line to tour the ship, I was staring silently, solemnly and fearfully at the launchers.

        Were just one of them used in anger, that missile wouldn't just take out a ship.

        --
        Yes I Have No Bananas. [gofundme.com]