
posted by Fnord666 on Friday February 03 2017, @01:51PM
from the calling-off-the-wedding dept.

Google's self-driving vehicles are "disengaging" less frequently on California roads:

California's Department of Motor Vehicles released its annual autonomous vehicle disengagement report today, in which all the companies that are actively testing self-driving cars on public roads in the Golden State disclose the number of times that human drivers were forced to take control of their driverless vehicles. The biggest news to come out of this report is from Waymo, Google's new self-driving car company, which reported a huge drop in disengagements in 2016 despite an almost equally huge increase in the number of miles driven. In other words, Waymo's self-driving cars are failing at a much lower rate, even as they are driving a whole lot more miles. The company says that since 2015, its rate of safety-related disengages has fallen from 0.8 per thousand miles to 0.2 per thousand miles in 2016. So while Waymo increased its driving by 50 percent in the state — racking up a total of 635,868 miles — the company's total number of reportable disengages fell from 341 in 2015 to 124.
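
As a quick sanity check on those figures, here is a minimal sketch in Python. Only the 2016 mileage and the two disengage counts are given directly above; the 2015 mileage is inferred from the stated 0.8-per-thousand rate, on the assumption that the 341 reportable disengages are comparable to that safety-related rate.

```python
# Back-of-the-envelope check on the figures quoted above.
# Given directly: 635,868 miles and 124 reportable disengages in 2016,
# 341 disengages in 2015, and a stated 2015 rate of 0.8 per 1,000 miles.
# Assumption: the 341 count is comparable to that stated rate, so the
# 2015 mileage can be inferred rather than taken from the report.

miles_2016 = 635_868
disengages_2016 = 124
disengages_2015 = 341
rate_2015 = 0.8  # disengages per 1,000 miles (as stated)

implied_miles_2015 = disengages_2015 / rate_2015 * 1_000
rate_2016 = disengages_2016 / miles_2016 * 1_000

print(f"Implied 2015 miles: {implied_miles_2015:,.0f}")        # ~426,250
print(f"2016 rate: {rate_2016:.2f} per 1,000 miles")           # ~0.20
print(f"Rate improvement: {rate_2015 / rate_2016:.1f}x")       # ~4.1x
print(f"Mileage growth: {miles_2016 / implied_miles_2015 - 1:.0%}")  # ~49%
```

Under that assumption the numbers hang together: roughly a four-fold improvement in the disengage rate alongside roughly 50 percent more miles driven.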

"This four-fold improvement reflects the significant work we've been doing to make our software and hardware more capable and mature," Dmitri Dolgov, head of self-driving technology for Waymo, wrote in a blog post. "And because we're creating a self-driving car that can take you from door to door, almost all our time has been spent on complex urban or suburban streets. This has given us valuable experience sharing the road safely with pedestrians and cyclists, and practicing advanced maneuvers such as making unprotected left turns and traversing multi-lane intersections."

The majority of Waymo's disengagements were the result of "software glitches," the company says. "Unwanted maneuvers," "perception discrepancies," and "recklessly behaving road user" also accounted for dozens of disengagements. There were no reports of crashes or accidents.


Original Submission

 
  • (Score: 2) by AthanasiusKircher (5291) on Friday February 03 2017, @07:40PM (#462554) Journal

    I'm sure there have been significant improvements in this technology. But I recall from looking in detail at some of Google's past reports that they don't provide data on when the drivers simply choose to drive themselves, other than perhaps a mileage summary. (And I recall that these cars were driven a rather large number of miles manually.)

    So this sort of data is open to reporting bias. Is it raining today, or are there other unfavorable visibility or road conditions? Maybe the driver chooses to drive manually. Does the driver know there's an ongoing construction zone with unclear signs? Okay -- the driver just takes over for most of the trip.

    What we get are reports only from the occasions when a driver was forced to take over due to an imminent AI problem. We don't get detailed reports on when drivers decide conditions aren't great for the AI and never engage it in the first place.

    Again, that's not to say the AI in these vehicles isn't improving. But the problem with many of Google's stats is that if you look closely at their reports, you realize there are a lot of "manual" miles driven in these cars. (For example, buried in this report [googleusercontent.com], Google says that over 15 months the AI cars drove about 424k miles in autonomous mode, but 100k miles manually. How many of those 100k miles were driven manually to avoid AI problems in the first place is not reported.)
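
    To put that in numbers, here's a rough sketch using the ~424k autonomous and ~100k manual miles cited above (the split of manual miles by reason is, as noted, unreported):

```python
# Share of testing done in autonomous vs. manual mode, per the figures
# cited from Google's report (~15 months of driving). How many manual
# miles were driven specifically to avoid AI trouble is not broken out.

autonomous_miles = 424_000
manual_miles = 100_000
total_miles = autonomous_miles + manual_miles

print(f"Autonomous share: {autonomous_miles / total_miles:.0%}")  # ~81%
print(f"Manual share:     {manual_miles / total_miles:.0%}")      # ~19%
```

    Even a modest fraction of that ~19 percent being "avoidance" driving would skew the headline per-mile stats.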

    So it's a bit disingenuous to tout stats like "X million miles driven with no accidents" when not only do drivers frequently have to disengage the AI because it isn't responding correctly, but there could also be plenty of places or occasions where these cars aren't actually tested in conditions that real drivers have to deal with (and where accidents may be more frequent, like inclement weather or unusual/unclear traffic patterns). It would be roughly equivalent to taking a new driver out with a learner's permit, but stopping the car and switching seats whenever it's time to go into heavy traffic. Obviously that's the cautious thing to do, and I'm not arguing the cars should be tested in non-ideal conditions yet -- but I'm also not sure how much we should make of "miles driven without an accident" stats in a situation like that.

    Perhaps someone can find a recent report that addresses these issues. Back when I tried sorting through Google's data out of curiosity (several months ago at least), this was all left rather ambiguous.

  • (Score: 2) by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday February 04 2017, @12:55AM (#462681) Journal

    "How many of those 100k miles were driven manually to avoid AI problems in the first place is not reported."

    If they aren't providing data about when and where the cars were driven, which would allow weather and traffic conditions to be determined, then that figure won't be too helpful anyway.
