
posted by martyb on Wednesday February 19 2020, @09:45PM
from the Do-these-trick-other-vendor's-systems? dept.

Hackers can trick a Tesla into accelerating by 50 miles per hour:

This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.

Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla's automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee's Advanced Threat Research team.

The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year's Model S sped up by 50 miles per hour.

This is the latest in an increasing mountain of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.
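The core idea behind this family of attacks can be sketched with a toy example in the style of the fast gradient sign method (FGSM): nudge each input coordinate by a small, bounded amount in the direction that increases the classifier's loss. The two-weight "classifier" and all numbers below are illustrative, not the Mobileye system.

```python
# Toy linear classifier: class 1 if the score is positive, else class 0.
# Weights, bias, input, and epsilon are all made-up illustrative values.
W = [1.0, -1.0]        # classifier weights
B = 0.0                # bias

def predict(x):
    """Return class 1 if the linear score is positive, else class 0."""
    score = sum(w * xi for w, xi in zip(W, x)) + B
    return 1 if score > 0 else 0

def sign(v):
    return 1.0 if v > 0 else -1.0 if v < 0 else 0.0

x = [0.3, 0.1]         # clean input: score 0.2 -> class 1
eps = 0.5              # perturbation budget (small and bounded)

# For true label 1, the loss grows as the score shrinks, so the gradient
# of the loss w.r.t. the input points along -W; step eps in that
# direction, one sign per coordinate.
x_adv = [xi + eps * sign(-w) for xi, w in zip(x, W)]   # [-0.2, 0.6]

print(predict(x), predict(x_adv))   # 1 0: a small structured nudge flips the class
```

Real attacks like the sticker do the same thing against a deep network: a perturbation that is small to a human eye but aligned with the model's gradients pushes the input across a decision boundary.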

[...] Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its cameras that in preliminary testing were not susceptible to this exact attack.

There are still a sizable number of Tesla cars operating with the vulnerable hardware, Povolny said. He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware.

"What we're trying to do is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible," Povolny said. "We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it."

So, it seems the point is not so much that a particular adversarial attack was successful (and fixed), but that it is one instance of a potentially huge class of attacks. Obligatory xkcd.


Original Submission

Previously:
Protecting Smart Machines From Smart Attacks
A New Clothing Line Confuses Automated License Plate Readers
A Simple Sticker Tricked Neural Networks Into Classifying Anything as a Toaster
3D Printed Turtles Fool Google Image Classification Algorithm
Slight Street Sign Modifications Can Completely Fool Machine Learning Algorithms

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Informative) by Anonymous Coward on Wednesday February 19 2020, @10:38PM (4 children)

    by Anonymous Coward on Wednesday February 19 2020, @10:38PM (#960046)

    most people would probably see 85 too

    Yes. But people aren't AI, they are real-intelligent: They will pick up that it is unreasonable for the speed limit to suddenly go from 35 to 85 on a little street, and will guess someone fucked with the sign.

  • (Score: 2, Insightful) by Ethanol-fueled on Wednesday February 19 2020, @10:52PM (1 child)

    by Ethanol-fueled (2792) on Wednesday February 19 2020, @10:52PM (#960057) Homepage

    The same assholes driving in these cars are the same assholes watching Harry Potter with self-driving enabled and totally oblivious to the outside world. They ain't gonna notice shit until they get punched in the face with an airbag and then locked inside to die in a fiery inferno.

    • (Score: 3, Funny) by c0lo on Wednesday February 19 2020, @11:11PM

      by c0lo (156) on Wednesday February 19 2020, @11:11PM (#960070) Journal

The same assholes ~~driving~~ being driven in these cars...

      FTFY

      They ain't gonna notice shit until they get punched in the face with an airbag and then locked inside to die in a fiery inferno.

      I would totally pay more taxes if I knew they are gonna be used for installing public, free of charge, "asshole punching and incinerating" facilities where this can happen without endangering the rest of decent humans.

      (who am I kidding, tho', the "decent human" race got extinct. Probably with Neanderthals)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0
  • (Score: 1) by anubi on Thursday February 20 2020, @05:47AM

    by anubi (2828) on Thursday February 20 2020, @05:47AM (#960207) Journal

    This is a good example of coding skill.

    One can be quite nebulous in telling a human to do something, and still get the point across.

    One can have all sorts of leadership skills when working with people, but this rarely works with state machines. If the company depends on their machines to work, a good coder and engineer is worth a helluva lot. But in reality the best often get poor reviews over people skills, as they are not a people person at heart...they deal with machines.

    Machines do exactly what they are told to do...not what they think you want. One does not impress a machine by having a corner office, private jet, three piece suit, and a pad of evaluation forms.

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
  • (Score: 0) by Anonymous Coward on Friday February 21 2020, @12:33AM

    by Anonymous Coward on Friday February 21 2020, @12:33AM (#960517)

    And once we move to all autonomous cars, governments will simply have to publish maps in a standard format so the car can look up the correct limits without having to try to parse some marks painted on a piece of metal. Then the car will have access to better information than the human. The virtue of human intuition in knowing not to drive fast in a residential zone only arises from the fact that we are in a transition period where the information is mainly being made for human consumption.
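    The lookup the commenter describes could be sketched as follows — the car consults a published digital map keyed by road segment instead of parsing painted signs. Segment IDs, limits, and the fallback value here are all hypothetical.

    ```python
    # Hypothetical sketch of a map-based speed-limit lookup: a defaced
    # sign cannot change what the published map says. All segment IDs
    # and limit values are invented for illustration.
    SPEED_LIMITS = {               # road-segment ID -> limit (mph)
        "elm-st-100": 35,
        "i-80-mile-212": 65,
    }
    DEFAULT_LIMIT = 25             # conservative fallback for unmapped roads

    def limit_for(segment_id):
        """Return the published limit for a segment, or a safe default."""
        return SPEED_LIMITS.get(segment_id, DEFAULT_LIMIT)

    print(limit_for("elm-st-100"))     # 35: the map, not the sign, decides
    print(limit_for("unknown-road"))   # 25: fall back conservatively
    ```

    The design choice matters: an attacker would have to compromise the signed map data itself rather than a piece of roadside metal, which is a much harder target.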