

posted by chromas on Thursday May 24 2018, @11:23PM   Printer-friendly
from the IRQL_NOT_LESS_OR_EQUAL dept.

Uber ends self-driving operation in Arizona

Uber has shuttered its self-driving testing program in Arizona and laid off close to 300 workers there — most of them test drivers, or "vehicle operators" — two months after one of its autonomous cars killed a pedestrian, the company said on Wednesday. The company had been testing its self-driving technology in the state since 2016, but halted operations in the wake of the March crash. The company's testing was also indefinitely suspended by the Arizona governor's office.

[...] Uber says it still plans to restart its self-driving operations in other locations (like Pittsburgh or San Francisco) once the investigations into the Arizona crash are complete. But in those locations, Uber will "drive in a much more limited way," according to an internal email obtained by Ars Technica.


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 4, Informative) by suburbanitemediocrity on Friday May 25 2018, @12:25AM (6 children)

    by suburbanitemediocrity (6844) on Friday May 25 2018, @12:25AM (#683825)

    I've spent my career developing safety critical systems (the kind where lots of people die if you screw up) and cannot fathom how this was allowed to occur.

    Other industries with century-plus experience are highly regulated. These autonomous-vehicle programs are more like engineering research projects (a field I spent a couple of years in and published papers on). Where's the reliability and safety analysis? For a passenger aircraft, the target is something like 1e-13 fatalities/hour. As much engineering goes into reliability as goes into developing the system itself.
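    A quick back-of-envelope sketch of what that 1e-13 fatalities/hour aviation-style target would imply for a test fleet. The fleet size and duty cycle below are my own illustrative assumptions (loosely based on the ~300 operators mentioned in the article), not figures from the thread:

    ```python
    # What a 1e-13 fatalities/hour reliability target implies for a test fleet.
    # Fleet size and hours are illustrative assumptions, not reported numbers.

    aviation_target = 1e-13  # fatalities per operating hour (aviation-style target)

    # Hypothetical fleet: 300 operators driving ~8 hours/day for a year.
    fleet_hours = 300 * 8 * 365  # = 876,000 vehicle-hours

    expected_fatalities = aviation_target * fleet_hours
    print(f"Expected fatalities at the aviation target: {expected_fatalities:.2e}")
    # Effectively zero. A single fatality in ~876,000 fleet-hours would put the
    # observed rate many orders of magnitude above that target.
    ```

    The point of the sketch: at aviation-grade reliability, a fleet this size should go millions of years between fatal failures, which is why a single fatality two years into testing is so damning.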

  • (Score: 2) by c0lo on Friday May 25 2018, @12:34AM (2 children)

    by c0lo (156) Subscriber Badge on Friday May 25 2018, @12:34AM (#683829) Journal

    Where's the reliability safety analysis? For a passenger aircraft, the number is something like 1e-13 fatalities/hour.

    But... how many passengers have the self-driving cars killed? (large grin)

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 3, Insightful) by suburbanitemediocrity on Friday May 25 2018, @12:53AM (1 child)

      by suburbanitemediocrity (6844) on Friday May 25 2018, @12:53AM (#683833)

      The FAA is there to protect the people on the ground as much as in the plane if not more so.

      The passengers knew what they were getting into and willingly took a risk. The family at home under the flight path eating dinner had no say in the matter.

      • (Score: 2) by c0lo on Friday May 25 2018, @01:18AM

        by c0lo (156) Subscriber Badge on Friday May 25 2018, @01:18AM (#683837) Journal

        I'll leave aside the ignored "(large grin)" (which would have allowed a 'whoosh')...

        The family at home under the flight path eating dinner had no say in the matter.

        ... and just interject here the ho [wikipedia.org]-hum [wikipedia.org].

        The FAA is there to protect the people on the ground as much as in the plane if not more so.

        My point with the ho-hum above? Nope, the FAA is there as an organization that manages the safety rules and best practices in aviation (as in "best effort" and/or "steward of knowledge"), not as a guarantor of safety for anyone in flight or under the flight path.

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 4, Informative) by takyon on Friday May 25 2018, @12:39AM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Friday May 25 2018, @12:39AM (#683832) Journal

    cannot fathom how this was allowed to occur.

    First of all, just look at the company we're dealing with here:

    https://www.google.com/search?q=site:soylentnews.org+Uber&cad=h [google.com]

    Uber Leaves France [soylentnews.org]
    Uber's World Class Tax Avoidance Infrastructure [soylentnews.org]
    Uber Evaded Law Enforcement With "Greyball" [soylentnews.org]
    Text Messages Between Uber's Travis Kalanick and Anthony Levandowski Released [soylentnews.org]
    Real-Life Example of Uber's Regulator-Evading Software [soylentnews.org]
    Uber: We Don't Have to Pay Drivers Based on Rider Fares [soylentnews.org]
    Uber Paid Hackers to Delete Stolen Data on 57 Million People [soylentnews.org]
    SoftBank Knew of Data Breach at Uber [soylentnews.org]
    Uber Letter Alleges Surveillance on Politicians and Competitors [soylentnews.org]
    MIT Study Shows How Much Driving for Uber or Lyft Sucks [soylentnews.org]

    And of course, who can forget:

    The Fall of Uber CEO Travis Kalanick [soylentnews.org]

    Specifically related to Uber's crash:

    https://soylentnews.org/submit.pl?op=viewsub&subid=25554 [soylentnews.org]

    Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.

    Yet Uber’s test drivers were being asked to do more — going on solo runs when they had worked in pairs.

    [...] And there also was pressure to live up to a goal to offer a driverless car service by the end of the year and to impress top executives.

    [...] Unlike California, where Uber had been testing since spring of 2017, Arizona state officials had taken a hands-off approach to autonomous vehicles and did not require companies to disclose how their cars were performing.

    [...] When Uber moved to a single operator, some employees expressed safety concerns to managers, according to the two people familiar with Uber’s operations. They were worried that going solo would make it harder to remain alert during hours of monotonous driving.

    [...] Not all drivers followed Uber’s training. One was fired after falling asleep at the wheel and being spotted by a colleague. Another was spotted air drumming as the autonomous car passed through an intersection, according to the two people familiar with Uber’s operations.
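    The disengagement figures quoted above are worth putting side by side. A minimal sketch using only the two numbers reported in the article:

    ```python
    # Comparing the miles-per-intervention figures quoted in the article.
    waymo_miles_per_intervention = 5600  # Waymo, California road tests (reported)
    uber_target = 13                     # Uber's own Arizona target (reported)

    ratio = waymo_miles_per_intervention / uber_target
    print(f"Waymo logged roughly {ratio:.0f}x more miles per intervention "
          f"than Uber's own target")
    # Roughly 430x -- and per the article, Uber was struggling even to hit 13.
    ```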

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 2) by coolgopher on Friday May 25 2018, @03:18AM

    by coolgopher (1157) on Friday May 25 2018, @03:18AM (#683868)

    But you have to understand, if they'd left the emergency brake system online, the ride might not have been smooth enough for the passengers!

    Seriously, that's gotta incur some serious liability. One does not simply switch off the emergency brake system.

  • (Score: 1) by anubi on Friday May 25 2018, @07:19AM

    by anubi (2828) on Friday May 25 2018, @07:19AM (#683921) Journal

    I've spent my career developing safety critical systems...

    May I ask what is your preferred framework?

    I have been involved in similar stuff, and the only thing I really trust is Micrium's Micro C/OS on the ColdFire processor.

    But I got canned before I even started that one.

    ( They wanted me to use an OS I had no trust in... as I was afraid just one midnight "security update" would make ME look like a feeble coder incapable of designing robust code.)

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]