posted by n1 on Monday June 08 2015, @05:03AM   Printer-friendly
from the uninsured-self-drivers dept.

In response to reports that their self-driving cars have not been totally free from accidents, Google has created a webpage where it will publish monthly reports detailing all of the accidents that its self-driving cars are involved in.

The first report [PDF] includes summaries of all accidents since the start of the Google X project in 2009:

The report for May showed Google cars had been involved in 12 accidents since it first began testing its self-driving cars in 2009, mostly involving rear-ending. Google said one of its vehicles was rear-ended at a stoplight in California on Thursday, bringing the total count to 13 accidents.

"That could mean that the vehicles tend to stop more quickly than human drivers expect," public interest group Consumer Watchdog said. The group called for more details on the accidents, including statements from witnesses and other drivers.

None of these accidents were caused by a fault with the car, Google said.


Original Submission

Related Stories

Self Driving Cars: Not so Accident Free after All

According to an article by the AP (via an ad-free site), several of the self-driving cars licensed to drive in California have been involved in accidents.

Most are slow-speed accidents, apparently with no injuries.

Four of the nearly 50 self-driving cars now rolling around California have gotten into accidents since September, when the state began issuing permits for companies to test them on public roads. Two accidents happened while the cars were in control; in the other two, the person who still must be behind the wheel was driving, a person familiar with the accident reports told The Associated Press.

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop "autonomous driving," a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.

Neither the companies involved nor the State of California will release details of these accidents, which rankles some critics.

Four accidents involving these 50 cars in eight months may seem a little high. Google's 23 cars have driven 140,000 miles in that time and racked up three accidents all by themselves. That is an order of magnitude higher than the National Transportation Safety Board's figure of 0.3 per 100,000 miles for non-injury accidents. However, the NTSB doesn't collect data on all fender-bender accidents.
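The comparison above can be checked with quick arithmetic. This is a rough sketch: the three-accident and 140,000-mile figures come from the paragraph above, and the NTSB number is read as a per-100,000-mile rate, which the original line doesn't state explicitly.

```python
# Rough check of the accident-rate comparison above.
google_accidents = 3        # autonomous-mode accidents attributed to Google's cars
google_miles = 140_000      # miles driven by Google's 23 cars in the period
ntsb_rate = 0.3             # NTSB non-injury rate, assumed per 100,000 miles

google_rate = google_accidents / google_miles * 100_000
print(f"Google: {google_rate:.2f} accidents per 100,000 miles")  # 2.14
print(f"Ratio to NTSB figure: {google_rate / ntsb_rate:.1f}x")   # 7.1x
```

So the quoted figures put Google's rate at roughly seven times the NTSB number, a bit under a full order of magnitude.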

The article says that none of the other states that permit self driving cars have any record of accidents.

This discussion has been archived. No new comments can be posted.
  • (Score: 2, Funny) by Anonymous Coward on Monday June 08 2015, @05:20AM

    by Anonymous Coward on Monday June 08 2015, @05:20AM (#193519)

    Firstly, at least three persons shall be employed to drive or conduct such locomotive, and if more than two waggons or carriages be attached thereto, an additional person shall be employed, who shall take charge of such waggons or carriages:
    Secondly, one of such persons, while any locomotive is in motion, shall precede such locomotive on foot by not less than sixty yards, and shall carry a red flag constantly displayed, and shall warn the riders and drivers of horses of the approach of such locomotives, and shall signal the driver thereof when it shall be necessary to stop, and shall assist horses, and carriages drawn by horses, passing the same.

    • (Score: 3, Informative) by c0lo on Monday June 08 2015, @09:16AM

      by c0lo (156) Subscriber Badge on Monday June 08 2015, @09:16AM (#193586) Journal
      [Citation] [wikipedia.org]: the Pennsylvania version

      would require all motorists piloting their "horseless carriages", upon chance encounters with cattle or livestock to (1) immediately stop the vehicle, (2) "immediately and as rapidly as possible... disassemble the automobile," and (3) "conceal the various components out of sight, behind nearby bushes" until equestrian or livestock is sufficiently pacified.

      --
      https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
  • (Score: -1, Flamebait) by Anonymous Coward on Monday June 08 2015, @05:30AM

    by Anonymous Coward on Monday June 08 2015, @05:30AM (#193522)

    "None of these accidents were caused by a fault with the car."

    That's right. All of them were.

    • (Score: 0) by Anonymous Coward on Monday June 08 2015, @05:34AM

      by Anonymous Coward on Monday June 08 2015, @05:34AM (#193523)

      We need to get them out of the driver's seat. Then they will pay us to do their driving for them.

      • (Score: 2, Funny) by Anonymous Coward on Monday June 08 2015, @05:44AM

        by Anonymous Coward on Monday June 08 2015, @05:44AM (#193528)

        The problem is the transition period. It's like the prankster's suggestion that the UK should switch to right-side traffic like the rest of the world, but ease in with a transition period where only heavy traffic switches during the first week...

        • (Score: 0) by Anonymous Coward on Monday June 08 2015, @05:47AM

          by Anonymous Coward on Monday June 08 2015, @05:47AM (#193529)

          Let's SWITCH self driving CARS to IPv6!

          • (Score: 4, Funny) by c0lo on Monday June 08 2015, @09:26AM

            by c0lo (156) Subscriber Badge on Monday June 08 2015, @09:26AM (#193589) Journal

            Let's SWITCH self driving CARS to IPv6!

            Mate, hand over the geek card!
            IPv6 is at the network layer (OSI layer 3); switching happens at OSI layer 2 (data link, MAC-based). The correct proposal would be

            Let's ROUTE self driving CARS to IPv6!

            (ducks)

            --
            https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 1, Insightful) by Anonymous Coward on Monday June 08 2015, @06:34AM

      by Anonymous Coward on Monday June 08 2015, @06:34AM (#193540)

      How do you figure?

      In my country, if I rear-end the car in front of me (assuming it did not pull out in front of me) then the responsibility of the accident is mine.

      It's quite simple: I was either following too closely, driving too fast (which is the same thing), or I wasn't paying sufficient attention. The driver in front was innocent of all wrongdoing.

      • (Score: 0) by Anonymous Coward on Monday June 08 2015, @09:18AM

        by Anonymous Coward on Monday June 08 2015, @09:18AM (#193587)

        Well, if the robot breaks very sharply it might cause an accident. Usually, however, the party doing the rear-ending will be liable. This does not stop dozens of cars regularly taking part in destruction-derby-style clusterfucks that block highways, potentially for hours, usually at rush hour.

        • (Score: 0) by Anonymous Coward on Monday June 08 2015, @04:45PM

          by Anonymous Coward on Monday June 08 2015, @04:45PM (#193717)

          >Well, if the robot breaks very sharply it might cause an accident.

          If the robot breaks, then it was at fault. The other driver must brake to avoid hitting the broken robot.

        • (Score: 3, Insightful) by sjames on Monday June 08 2015, @07:00PM

          by sjames (2882) on Monday June 08 2015, @07:00PM (#193765) Journal

          Again, that's following too close. You are supposed to leave enough of a gap that even if the car in front suddenly brakes at its maximum deceleration, you have enough time to react and come to a stop.

  • (Score: 4, Interesting) by zafiro17 on Monday June 08 2015, @05:42AM

    by zafiro17 (234) on Monday June 08 2015, @05:42AM (#193527) Homepage

    Think what you will about the usefulness of the project. Google should at least get credit for attempting bold things, and for conducting science in a way that seems - to this casual observer at least - relatively transparent. There are scientists out there doing the same thing, but not all of them, and increasingly, as science in fields like pharmaceuticals and medicine gets consumed by business interests, the work is being done in almost total secrecy. Yeah, yeah, it costs money to develop the next [drug that sounds like Be-agra] blah blah blah. I've got a friend who works for the medical publishing industry and he says it's rotten to the core.

    Keep it honest, Google. And get us that automated transport system so the dumbasses I know are no longer responsible for being responsible. My friends are sucky drivers, really. Also, imagine not having to drive your car: how many more enjoyable things could you do with that time?

    PS: this post got flagged for spam and lameness because I used a drug brand name. Seems the lameness filter could use some fine-tuning.

    --
    Dad always thought laughter was the best medicine, which I guess is why several of us died of tuberculosis - Jack Handey
    • (Score: 2, Interesting) by anubi on Monday June 08 2015, @05:56AM

      by anubi (2828) on Monday June 08 2015, @05:56AM (#193531) Journal

      I am behind Google on this one. I am not saying I would freely give up anyone's right to steer his own car any more than I would argue that people can't shift gears when they want to.

      I drive a manual car, but I had good reason for wanting a manual car. When I bought it, I lived in mountainous terrain. There were things I knew about the road that no automatic transmission could know about.

      I am getting older now, and I personally feel it's getting to be time to delegate my driving to a machine. Actually, I think it would do a better job of it than I can. I could still take over if it's some backwater dirt road, but for all this stop-and-go city driving, I'd just as soon the machine do it.

      I do not think my mom would have tried to outdo her sewing machine either, and I sure as heck appreciate my clothes washing machine.

      Google, I wish you the best of luck with this... and I look forward to seeing this in a van. I just hope it's available before I am no longer around.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 2) by Non Sequor on Monday June 08 2015, @07:25AM

    by Non Sequor (1005) on Monday June 08 2015, @07:25AM (#193553) Journal

    If you get in an accident and contact a shyster lawyer, they file a suit against the other driver, their employer (if they were driving on the job, or even driving to work in a work vehicle), the city, whatever company last worked on the road where the accident occurred, and anyone else even tangentially involved.

    You can usually go fishing for "fault" and have a good chance at a settlement. Any self-driving car project is likely going to either need to be granted immunity from these lawsuits or it needs to have expected settlement costs priced in.

    I haven't seen anything that suggests that Google has picked up on this yet, although I would expect that car companies adding semi-automated features may be contemplating it.

    --
    Write your congressman. Tell him he sucks.
    • (Score: 1) by anubi on Monday June 08 2015, @07:52AM

      by anubi (2828) on Monday June 08 2015, @07:52AM (#193561) Journal

      I would think a Google car would be able to dump megabytes of images and data detailing exactly what happened and its response to it.

      My guess is (barring some bad computer programming) it's going to be hard as hell to pin anything on Google.

      I guess you have seen those saws they test by sticking a frankfurter (simulating a finger) up to them to see how fast they stop.

      I would not be surprised if a Google car stops for anything moving into its collision path. Much faster than a human could respond.

      It will probably not take long for the motoring public to learn not to tailgate a Google car - and that Google cars will honor the exact letter of the law down to the tiniest nit.

      If anything, I could see people taunting a Google car so as to force it to obstruct traffic and annoy the passenger.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
      • (Score: 2) by Non Sequor on Monday June 08 2015, @08:22AM

        by Non Sequor (1005) on Monday June 08 2015, @08:22AM (#193570) Journal

        Any actual trial would be based on subjective decisions relative to tests prescribed by case law for determining if the defendant took appropriate measures to prevent the accident. My understanding is that these do reflect cost-to-benefit analysis on a subjective basis. The trial result depends on the jury's judgment of how the facts of the case as presented relate to the tests given by the judge.

        It's a crap shoot, basically, although for some that's a hard problem to avoid. Regardless, since there are always more measures that could be taken to improve safety, you can't preemptively resolve the issue with data. The data supports your side of the case, but it doesn't lock it down.

        Whoops, and I forgot to mention it's cheaper to settle than to go to court anyway. These plaintiffs will typically accept a settlement priced accordingly. It's disproportionately expensive to go to trial with these cases just for the sake of discouraging them.

        --
        Write your congressman. Tell him he sucks.
        • (Score: 1) by Dr Spin on Monday June 08 2015, @09:15AM

          by Dr Spin (5239) on Monday June 08 2015, @09:15AM (#193584)

          As a general rule, "trial by jury" involves the use of a jury: 12 people, probably with a very limited grasp of any relevant concept and a deep wish to be elsewhere. You may get justice, or anything else could happen.

          Google needs to invest serious money in taking down "lawyers are us" and all similar organized scumbags (including patent trolls).

          --
          Warning: Opening your mouth may invalidate your brain!
        • (Score: 2) by HiThere on Monday June 08 2015, @07:09PM

          by HiThere (866) on Monday June 08 2015, @07:09PM (#193767) Journal

          If Google becomes known as an easy mark, it will cost them a lot more than being known as someone who will fight to the last penny unless you have a prima facie case.

          --
          Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
          • (Score: 2) by Non Sequor on Monday June 08 2015, @10:53PM

            by Non Sequor (1005) on Monday June 08 2015, @10:53PM (#193852) Journal

            Existing situations with similar dynamics haven't gone towards a deterrence based equilibrium. What makes Google different?

            Patent disputes may be a situation where the equilibrium matches your prediction of aggressive and costly defense to avoid shakedowns. That situation is different in that the attackers tend to be somewhat larger and more heavily invested in single cases, whereas in the accident-liability cases the attackers are smaller and less heavily invested in single cases.

            --
            Write your congressman. Tell him he sucks.
      • (Score: 3, Insightful) by t-3 on Monday June 08 2015, @08:53AM

        by t-3 (4907) on Monday June 08 2015, @08:53AM (#193575)

        Worse than taunting, I can see people intentionally causing accidents and fucking over self-driving cars. Aside from the stuff weather and nature throw at cars all the time, intentional human malice is a huge problem. I can't wait for the day when I see criminals fuck over cops by throwing shit out the window.

    • (Score: 4, Insightful) by bradley13 on Monday June 08 2015, @12:35PM

      by bradley13 (3053) on Monday June 08 2015, @12:35PM (#193614) Homepage Journal

      This is what no fault insurance is for. There may need to be some tweaks to make it work for autonomous vehicles, but basically: if you get in an accident, your insurance pays your costs and the other person's insurance pays their costs.

      Civil "get rich quick" suits filed by shyster lawyers? That's mostly an American problem, and easily fixed by two things. First, switch to "loser pays", so that you can't play the lottery for free. Second, US courts need to be much more willing to penalize shyster lawyers. Look at Prenda [wikipedia.org]: more than two years after a court was finally willing to call them on their dirty little game, they still have not had their day of reckoning. They haven't paid their fines yet and they haven't been disbarred; in fact, they are busily pursuing new, equally shady shakedown schemes [techdirt.com].

      --
      Everyone is somebody else's weirdo.
  • (Score: 2) by gman003 on Monday June 08 2015, @02:29PM

    by gman003 (4155) on Monday June 08 2015, @02:29PM (#193658)

    According to their report, their cars have driven a bit over a million miles in autonomous mode, and nearly 800K in manual mode.

    That manual-mode number seems rather high. If that's a realistic representation of how often you'll have to engage manual mode in regular driving, it means autonomous mode will only be working about 55% of the time.

    Is there something inflating that number? One of the accidents was from "personal use" of the car - is that mileage counted as testing? Do they drive to the test area in manual mode?

    What I'd like to see is a breakdown of "manual override" miles (where the vehicle was in autonomous mode but needed human intervention) versus "manual non-testing" miles (where the vehicle wasn't supposed to be in autonomous mode). That would give us a better idea of how often autonomous mode fails (it's obviously a clean failure mode, since it hasn't caused accidents, but it's still a failure mode).
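The 55% figure in the comment above follows directly from the reported mileage split. A quick sketch, taking the "a bit over a million" and "nearly 800K" figures at face value:

```python
# Autonomous share of total test miles implied by the report's mileage split.
autonomous_miles = 1_000_000  # "a bit over a million miles" in autonomous mode
manual_miles = 800_000        # "nearly 800K" in manual mode

share = autonomous_miles / (autonomous_miles + manual_miles)
print(f"Autonomous share of total test miles: {share:.1%}")  # 55.6%
```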

    • (Score: 3, Funny) by Nerdfest on Monday June 08 2015, @02:38PM

      by Nerdfest (80) on Monday June 08 2015, @02:38PM (#193664)

      Perhaps it was where the 'operator' got annoyed at it driving so cautiously and wanted to speed a bit, pass one of those annoying people who tap their brakes constantly, or continually change speeds randomly on an open highway. For future automated vehicles I propose that they let the car do the driving while the human operates the weapons systems.

  • (Score: 1) by anubi on Tuesday June 09 2015, @12:54AM

    by anubi (2828) on Tuesday June 09 2015, @12:54AM (#193877) Journal

    intentional human malice is a huge problem

    That, sir, is my biggest apprehension.

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]