
SoylentNews is people

posted by Fnord666 on Friday February 03 2017, @01:51PM   Printer-friendly
from the calling-off-the-wedding dept.

Google's self-driving vehicles are "disengaging" less frequently on California roads:

California's Department of Motor Vehicles released its annual autonomous vehicle disengagement report today, in which all the companies that are actively testing self-driving cars on public roads in the Golden State disclose the number of times that human drivers were forced to take control of their driverless vehicles. The biggest news to come out of this report is from Waymo, Google's new self-driving car company, which reported a huge drop in disengagements in 2016 despite an almost equally huge increase in the number of miles driven. In other words, Waymo's self-driving cars are failing at a much lower rate, even as they are driving a whole lot more miles. The company says that since 2015, its rate of safety-related disengages has fallen from 0.8 per thousand miles to 0.2 per thousand miles in 2016. So while Waymo increased its driving by 50 percent in the state — racking up a total of 635,868 miles — the company's total number of reportable disengages fell from 341 in 2015 to 124.
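The reported figures are easy to sanity-check. A minimal sketch, using only the numbers quoted above (the 0.2-per-thousand figure covers safety-related disengages, while 124 is the total reportable count):

```python
# Sanity check on the Waymo figures quoted in the article.
miles_2016 = 635_868        # total autonomous miles driven in California in 2016
disengages_2016 = 124       # total reportable disengagements in 2016
disengages_2015 = 341       # total reportable disengagements in 2015

# Disengagements per 1,000 miles driven.
rate_per_1k = disengages_2016 / (miles_2016 / 1000)
print(f"2016 rate: {rate_per_1k:.2f} per 1,000 miles")

# The "four-fold improvement" refers to the safety-related rate
# falling from 0.8 to 0.2 per thousand miles.
print(f"Improvement factor: {0.8 / 0.2:.0f}x")
```

The computed 2016 rate comes out to roughly 0.2 per thousand miles, consistent with the figure Waymo reported.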

"This four-fold improvement reflects the significant work we've been doing to make our software and hardware more capable and mature," Dmitri Dolgov, head of self-driving technology for Waymo, wrote in a blog post. "And because we're creating a self-driving car that can take you from door to door, almost all our time has been spent on complex urban or suburban streets. This has given us valuable experience sharing the road safely with pedestrians and cyclists, and practicing advanced maneuvers such as making unprotected left turns and traversing multi-lane intersections."

The majority of Waymo's disengagements were the result of "software glitches," the company says. "Unwanted maneuvers," "perception discrepancies," and "recklessly behaving road user" also accounted for dozens of disengagements. There were no reports of crashes or accidents.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by fishybell on Friday February 03 2017, @05:52PM

    by fishybell (3156) on Friday February 03 2017, @05:52PM (#462501)

    If we include everything under "software glitch," what exactly would we learn from these reports?

    Unless people can fall asleep drunk and get to their final destination nearly 100% of the time, people won't trust self-driving cars.

    Learning about specific types of problems helps ease tensions, but if they didn't group them into somewhat vague categories, you'd have 1 category per incident. Instead of 15 "the car didn't do what I expected" you'd have 15 different "the car did X when I expected Y" reports.

    I'm guessing "software glitch" refers to the software crashing or freezing, which is something I would never be okay with.

  • (Score: 1) by Scruffy Beard 2 on Friday February 03 2017, @06:31PM

    by Scruffy Beard 2 (6030) on Friday February 03 2017, @06:31PM (#462522)

    Hey, I freeze myself in traffic sometimes. Do it for too long and you get honked at.