
posted by mrpg on Friday February 02 2018, @09:00PM   Printer-friendly
from the what-could-possibly-go-rwong dept.

2,304 times last year (in just one state), so-called self-driving cars, ahem, didn't self-drive, according to this report at Auto Connected Car News.

The technology is not safe unless it is monitored by a human behind a steering wheel who can take control, Consumer Watchdog said.

Reasons for disengagement include:
    [a lot of human factors -- which "AI" does not understand]
    * Hardware discrepancy.
    * Errors in detection.
    * GPS signal issues.
    * Software crash.

While 50 companies are licensed to test autonomous vehicles in California, only 19 were required to file disengagement reports covering 2017.
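
Read as software, the listed triggers amount to a supervisory watchdog: hand control back to the human whenever any monitored condition fails. Below is a minimal sketch of that idea in Python; every field name and threshold is an assumption for illustration, not anything taken from the report.

    # Illustrative only: the fields and thresholds here are assumed; they
    # just mirror the four concrete disengagement triggers listed above.
    from dataclasses import dataclass

    @dataclass
    class VehicleStatus:
        hardware_ok: bool            # no hardware discrepancy reported
        detection_confidence: float  # 0.0-1.0 from the perception stack
        gps_satellites: int          # satellites currently in view
        software_alive: bool         # heartbeat from the driving process

    MIN_DETECTION_CONFIDENCE = 0.9   # assumed safety threshold
    MIN_GPS_SATELLITES = 4           # minimum for a 3D position fix

    def should_disengage(s: VehicleStatus) -> bool:
        """True when control should be handed back to the human driver."""
        return (not s.hardware_ok
                or s.detection_confidence < MIN_DETECTION_CONFIDENCE
                or s.gps_satellites < MIN_GPS_SATELLITES
                or not s.software_alive)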


Original Submission

 
  • (Score: 2) by Thexalon (636) on Saturday February 03 2018, @05:20PM (#632604)

    If a supposedly autonomous vehicle can't drive safely and sanely without GPS info, I'm just going to go ahead and say it can't drive at all.

    I'm guessing the primary use of the GPS info is to figure out where the car is supposed to be going. The rest of the software can keep the car in its lane at a reasonable speed, but it can't always tell whether you're trying to take the next exit ramp or turn left at the next stop sign (especially if the streets aren't well-marked, which is fairly common).
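
    A toy sketch of that split (all names hypothetical, not from any real AV stack): local perception alone can hold the lane, while losing the GPS fix only degrades route following.

        # Hypothetical illustration: lane keeping needs only local perception;
        # route following ("take the next exit") needs a global position fix.
        from typing import List, Optional, Tuple

        def control_step(lane_offset_m: float,
                         gps_fix: Optional[Tuple[float, float]],
                         route_plan: List[str]) -> dict:
            # Steer back toward lane center with a simple proportional correction.
            steering = -0.5 * lane_offset_m

            if gps_fix is None:
                # Without a fix the car can still hold its lane at a safe speed,
                # but it no longer knows which exit or turn belongs to the route.
                return {"steering": steering, "maneuver": "hold-lane"}

            next_maneuver = route_plan[0] if route_plan else "continue"
            return {"steering": steering, "maneuver": next_maneuver}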

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.