posted by Fnord666 on Friday February 03 2017, @01:51PM   Printer-friendly
from the calling-off-the-wedding dept.

Google's self-driving vehicles are "disengaging" less frequently on California roads:

California's Department of Motor Vehicles released its annual autonomous vehicle disengagement report today, in which all the companies that are actively testing self-driving cars on public roads in the Golden State disclose the number of times that human drivers were forced to take control of their driverless vehicles. The biggest news to come out of this report is from Waymo, Google's new self-driving car company, which reported a huge drop in disengagements in 2016 despite an almost equally huge increase in the number of miles driven. In other words, Waymo's self-driving cars are failing at a much lower rate, even as they are driving a whole lot more miles. The company says that since 2015, its rate of safety-related disengages has fallen from 0.8 per thousand miles to 0.2 per thousand miles in 2016. So while Waymo increased its driving by 50 percent in the state — racking up a total of 635,868 miles — the company's total number of reportable disengages fell from 341 in 2015 to 124.
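The quoted rates can be checked against the raw counts. A minimal sketch, assuming the 2015 mileage is inferred from the stated 50 percent increase (it is not given directly in the article):

```python
# Figures reported in the article; 2015 mileage is an assumption
# derived from the stated 50 percent year-over-year increase.
miles_2016 = 635_868
miles_2015 = miles_2016 / 1.5          # ~423,912 miles (inferred)
disengages_2015 = 341
disengages_2016 = 124

# Disengagements per thousand miles driven
rate_2015 = disengages_2015 / (miles_2015 / 1000)
rate_2016 = disengages_2016 / (miles_2016 / 1000)

print(f"2015: {rate_2015:.2f} per 1,000 miles")   # ~0.80
print(f"2016: {rate_2016:.2f} per 1,000 miles")   # ~0.20
print(f"improvement: {rate_2015 / rate_2016:.1f}x")
```

The ratio works out to roughly 4x, consistent with the "four-fold improvement" Waymo claims below.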

"This four-fold improvement reflects the significant work we've been doing to make our software and hardware more capable and mature," Dmitri Dolgov, head of self-driving technology for Waymo, wrote in a blog post. "And because we're creating a self-driving car that can take you from door to door, almost all our time has been spent on complex urban or suburban streets. This has given us valuable experience sharing the road safely with pedestrians and cyclists, and practicing advanced maneuvers such as making unprotected left turns and traversing multi-lane intersections."

The majority of Waymo's disengagements were the result of "software glitches," the company says. "Unwanted maneuvers," "perception discrepancies," and "recklessly behaving road user" also accounted for dozens of disengagements. There were no reports of crashes or accidents.


Original Submission

  • (Score: 4, Interesting) by maxwell demon on Friday February 03 2017, @03:19PM

    by maxwell demon (1608) on Friday February 03 2017, @03:19PM (#462408) Journal

    "Unwanted maneuvers," "perception discrepancies," and "recklessly behaving road user" also accounted for dozens of disengagements. There were no reports of crashes or accidents.

    What exactly do those terms mean here?

    "Unwanted maneuvers" — Did Google's car make an unwanted maneuver? If so, how does that differ from a "software glitch"? And if not, what qualifies as an "unwanted maneuver" by other road users? "We'd prefer they didn't do it, because our software has a hard time handling it"?

    "perception discrepancies" — What is this supposed to mean? That the car was right, but the human was wrong? Because otherwise, wouldn't it be, again, a software glitch?

    "recklessly behaving road user" — I hope that one doesn't refer to the autopilot … although I guess that would find customers, too ;-)

    --
    The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 2) by takyon on Friday February 03 2017, @03:34PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday February 03 2017, @03:34PM (#462425) Journal

    "recklessly behaving road user" — I hope that one doesn't refer to the autopilot … although I guess that would find customers, too ;-)

    An automated ride and a blowjob? Sex bots that transform into cars? [youtube.com]

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by ikanreed on Friday February 03 2017, @04:40PM

    by ikanreed (3164) Subscriber Badge on Friday February 03 2017, @04:40PM (#462466) Journal

    I would think the meanings are pretty clear:

    Unwanted maneuvers refers to the car doing something the human monitor thinks it ought not to: crossing lanes, turning unexpectedly, running red lights, speeding, whatever.
    Perception discrepancies are when the camera goes "Oh shit there's something there" and the lidar goes "I don't see it" and the car turns over to the human.
    Recklessly behaving road user definitely refers to people doing dangerous shit on the road, and the car doesn't know how to predict their future behavior and avoid running into them.

    • (Score: 2) by Arik on Friday February 03 2017, @05:06PM

      by Arik (4543) on Friday February 03 2017, @05:06PM (#462479) Journal
      "Unwanted maneuvers refers to the car doing something the human monitor thinks it ought not to: crossing lanes, turning unexpectedly, running red lights, speeding, whatever."

      Except that would already be accounted for under software glitch.
      --
      If laughter is the best medicine, who are the best doctors?
      • (Score: 2) by ikanreed on Friday February 03 2017, @05:33PM

        by ikanreed (3164) Subscriber Badge on Friday February 03 2017, @05:33PM (#462490) Journal

        Only if we define "Glitch" in a way that is identical to "Something other than what the driver wants".

        Which is a perfectly acceptable definition, but to a programmer, a glitch is usually more akin to an actual programming error, rather than a disconnect between requirements and what the user actually wants.

      • (Score: 2) by fishybell on Friday February 03 2017, @05:52PM

        by fishybell (3156) on Friday February 03 2017, @05:52PM (#462501)

        If we include everything under "software glitch," what exactly would we learn from these reports?

        Unless people can fall asleep drunk and get to their final destination nearly 100% of the time, people won't trust self-driving cars.

        Learning about specific types of problems helps ease tensions, but if they didn't group them into somewhat vague categories, you'd have 1 category per incident. Instead of 15 "the car didn't do what I expected" you'd have 15 different "the car did X when I expected Y" reports.

        I'm guessing "software glitch" refers to the software crashing or freezing. That's something I would never be okay with.

        • (Score: 1) by Scruffy Beard 2 on Friday February 03 2017, @06:31PM

          by Scruffy Beard 2 (6030) on Friday February 03 2017, @06:31PM (#462522)

          Hey, I freeze myself in traffic sometimes. Do it too long, you get honked at.

  • (Score: 2) by DeathMonkey on Friday February 03 2017, @06:37PM

    by DeathMonkey (1380) on Friday February 03 2017, @06:37PM (#462525) Journal

    What exactly do those terms mean here?
    "Unwanted maneuvers" — Did Google's car do an unwanted maneuver?

    Yes. If you look at the report the cause is actually "disengage for unwanted maneuver of the vehicle" so the causes in the summary are excerpted a bit.

    I'd copy/paste them here but it's a scanned report...