
SoylentNews is people

posted by martyb on Wednesday February 19 2020, @09:45PM
from the Do-these-trick-other-vendors'-systems? dept.

Hackers can trick a Tesla into accelerating by 50 miles per hour:

This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.

Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla's automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee's Advanced Threat Research team.

The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year's Model S sped up by 50 miles per hour.

This is the latest in an increasing mountain of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.
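The attacks in this line of research exploit adversarial examples: tiny, targeted input changes that flip a model's output. As a minimal sketch of the idea (the toy linear classifier, weights, and inputs below are invented for illustration, not McAfee's actual method, which targeted a deep vision model), a gradient-sign perturbation looks like this:

```python
# Sketch of a gradient-sign ("FGSM"-style) adversarial perturbation
# against a toy linear classifier. Everything here is hypothetical:
# the real attack targeted the Mobileye vision stack, not this toy.

w = [2.0, -1.0, 0.5]          # classifier weights, known to the attacker

def score(x):
    # score > 0 -> sign read as "35"; score <= 0 -> sign read as "85"
    return sum(wi * xi for wi, xi in zip(w, x))

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

x = [0.3, -0.2, 0.4]          # clean input: classified as "35"

# Nudge every feature by a small epsilon in the direction that lowers
# the score -- the sign of the gradient, which for a linear model is
# just the sign of each weight.
epsilon = 0.4
x_adv = [xi - epsilon * sign(wi) for xi, wi in zip(x, w)]

print(score(x) > 0, score(x_adv) > 0)   # True False: the label flips
```

The reason such small tweaks work: each input coordinate moves by at most epsilon, but the score shifts by epsilon times the sum of the weight magnitudes, which grows with input dimensionality. That is why a sticker that is nearly invisible to a human can swing a high-dimensional image model.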

[...] Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its EyeQ3 camera that in preliminary testing were not susceptible to this exact attack.

There are still a sizable number of Tesla cars operating with the vulnerable hardware, Povolny said. He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware.

"What we're really trying to do is raise awareness for both consumers and vendors of the types of flaws that are possible," Povolny said. "We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it."

So, it seems this is not so much that a particular adversarial attack was successful (and fixed), but that it was but one instance of a potentially huge set. Obligatory xkcd.

Original Submission

Protecting Smart Machines From Smart Attacks
A New Clothing Line Confuses Automated License Plate Readers
A Simple Sticker Tricked Neural Networks Into Classifying Anything as a Toaster
3D Printed Turtles Fool Google Image Classification Algorithm
Slight Street Sign Modifications Can Completely Fool Machine Learning Algorithms

This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:00PM (#960407) (2 children)

    There are much easier ways to be vandals these days as well. You can put some crazy obstacle on the freeway and that can cause all sorts of accidents. A bunch of nails or a spike strip or a piece of wood with nails in it. Imagine the things a vandal can do to train tracks to cause a train wreck. Yet somehow we manage to get along. We don't ban cars and trains because they can be vandalized. Otherwise everything would get banned.

    I don't think the assumption should be that people will vandalize whenever possible. Sure, there should be ways to catch vandals (the cars should have cameras, there should be security and cameras around, etc.), punish them when they are caught to deter them, and make systems resistant to vandalism. But refusing to do anything because it can be vandalized would mean everyone living in a bubble. It wouldn't be practical.

  • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:13PM (#960409) (1 child)

    My car is parked in a parking lot right now. Someone can take a rock and bash my windows in. OH NO!!! I should have walked to work!! LETS BAN CARS!!!!!

    • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:22PM (#960412)

      Errr ... Let's *