
posted by martyb on Wednesday February 19 2020, @09:45PM   Printer-friendly
from the Do-these-trick-other-vendor's-systems? dept.

Hackers can trick a Tesla into accelerating by 50 miles per hour:

This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.

Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla's automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee's Advanced Threat Research team.

The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year's Model S sped up by 50 miles per hour.

This is the latest in an increasing mountain of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.
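The class of attack referenced here, adversarial examples, exploits the fact that a classifier's decision can be flipped by small, deliberately chosen input perturbations. The sketch below illustrates the idea with the fast gradient sign method (FGSM) on a toy linear classifier; this is a standard textbook technique, not the specific physical-sticker attack from the McAfee research, and the weights and feature values are invented for illustration.

```python
# Minimal FGSM sketch on a toy linear classifier (illustrative only).
# The McAfee attack was a physical sticker on a real sign; this just
# shows why small, targeted perturbations can flip a model's output.

def score(w, b, x):
    """Linear decision score: positive -> wrong class, negative -> right class."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def fgsm(w, x, eps):
    """Nudge each feature by eps in the direction that raises the score.
    For a linear score w.x + b, the gradient w.r.t. x is simply w,
    so the perturbation is eps * sign(w) per feature."""
    return [xi + eps * (1 if wi > 0 else -1) for wi, xi in zip(w, x)]

w = [0.9, -1.2, 0.4]       # toy learned weights (assumed for the demo)
b = -0.1
x = [0.2, 0.5, 0.1]        # a benign input: classified correctly (score < 0)

adv = fgsm(w, x, eps=0.3)  # each feature moves by at most 0.3
print(score(w, b, x))      # negative: original input, correct class
print(score(w, b, adv))    # positive: perturbed input, misclassified
```

Even though no feature changed by more than the small bound eps, the score crosses the decision boundary; the research discussed above shows the same principle surviving the transfer into the physical world.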

[...] Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its EyeQ cameras that, in preliminary testing, were not susceptible to this exact attack.

There are still a sizable number of Tesla cars operating with the vulnerable hardware, Povolny said. He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware.

"What we're trying to do is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible," Povolny said. "We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it."

So, it seems the point is not so much that one particular adversarial attack was successful (and fixed), but that it was one instance of a potentially huge set. Obligatory xkcd.


Original Submission

Previously:
Protecting Smart Machines From Smart Attacks
A New Clothing Line Confuses Automated License Plate Readers
A Simple Sticker Tricked Neural Networks Into Classifying Anything as a Toaster
3D Printed Turtles Fool Google Image Classification Algorithm
Slight Street Sign Modifications Can Completely Fool Machine Learning Algorithms

 
  • (Score: 2) by bzipitidoo on Thursday February 20 2020, @01:58AM


    One time, coming back from a long road trip of 2000 miles, I was stunned to discover that the last 10 miles, on a road I know well but hadn't used in a few years, were by far the hardest to drive. It was a country road around which housing developments had recently sprung up. A lot of road work was in progress to turn it from a 2-lane highway into a 6-lane street. They had lane shifts and barrels and changes of pavement. Several times, the single lane would split into 2, and it was not clear whether the through lane was the right lane with a left turn lane, or the left lane with a right turn lane. Further, it had gotten dark, and all these newly opened stores had hastily erected illumination that was not well positioned, so that many of the lights were shining in the drivers' faces. It's bad enough being blinded by oncoming traffic, without having to deal with that. To add to the fun, it was a bad kind of busy, with a lot more oncoming traffic than traffic on my side, which meant no one to follow through the maze, and lots of headlights glaring in my face.

    Texas does a bad job of directing traffic in road construction zones. I wonder how well these self-driving cars would handle a hell drive like that one.

    There are worse drives. Try a 100 mile trip in winter weather, in those hours just after a blizzard and before the snowplows have had a chance to clear the roads. If you can get through at all, you may be doing stuff like ramming your way through snowdrifts. Back up, get a running start, and plow into the snow. That gets you a few feet of progress. Then back up and do it again. And again, and again. For hours. The snow can't be too deep for that to work. If the road is buried under 8 feet of snow for miles, you are not ramming through that with a car; you are stuck until the snowplows clear the road. Then you get the thrill of driving in a snow canyon. I've never done any of that, but in his younger days, my father did, many times. I always wondered what was so urgent that he couldn't wait until the next day. In later years, he did wait. Anyway, I'd like to see an AI handle that.
