Hackers can trick a Tesla into accelerating by 50 miles per hour:
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla's automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee's Advanced Threat Research team.
The researchers stuck a small, nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year's Model S sped up by 50 miles per hour.
This is the latest in an increasing mountain of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.
[...] Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its cameras that, in preliminary testing, were not susceptible to this exact attack.
There are still a sizable number of Tesla cars operating with the vulnerable hardware, Povolny said. He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware.
"What we're trying to do is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible," Povolny said. "We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it."
So, it seems this is not so much that a particular adversarial attack was successful (and fixed), but that it was but one instance of a potentially huge set. Obligatory xkcd.
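To make the "huge set" point concrete, here is a minimal, hypothetical sketch of the general class of attack (a fast-gradient-sign-style perturbation) against a toy linear classifier. This is not McAfee's actual method or Mobileye's model; the weights and threshold are made up purely to illustrate how a small, structured nudge to the input can flip a classifier's answer.

```python
import numpy as np

# Toy linear "sign classifier": score > 0 means the sign reads "85",
# otherwise "35". The weights stand in for a trained model's parameters.
w = np.linspace(-1.0, 1.0, 64)
b = -0.5

def predict(x):
    return "85" if x @ w + b > 0 else "35"

# An input the model correctly reads as "35".
x = -0.1 * w

# Fast-gradient-sign-style perturbation: nudge every input feature a
# small, fixed amount in the direction that increases the "85" score.
# For a linear model, that direction is simply sign(w).
epsilon = 0.9
x_adv = x + epsilon * np.sign(w)

print(predict(x), "->", predict(x_adv))  # 35 -> 85
```

The per-feature change is tiny and structured (like a small sticker), yet the summed effect across the whole input flips the decision. Real attacks on deep vision models work on the same principle, just with gradients computed through the network.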
Previously:
Protecting Smart Machines From Smart Attacks
A New Clothing Line Confuses Automated License Plate Readers
A Simple Sticker Tricked Neural Networks Into Classifying Anything as a Toaster
3D Printed Turtles Fool Google Image Classification Algorithm
Slight Street Sign Modifications Can Completely Fool Machine Learning Algorithms
(Score: 2) by Hyperturtle on Thursday February 20 2020, @03:19PM (5 children)
Wow, image recognition is fooled by vandalism of the image? Who knew? It's not like the cars are doing a careful analysis of previous speed signs, known traffic-corridor speed rules, whether there is construction, and all the other things a strict rule-abiding system wouldn't expect from established rules that are not subject to unanticipated changes.
Laws are based on trust. People who break the law are punished so that others are incentivized not to do the same thing, and hopefully recognize the value of a framework we can all trust and put our faith in.
Articles or studies spell out for us that if we change a 3 into an 8 by whatever means, the machine doing optical recognition sees an 8. This is no different from what I learned programming on my Commodore 64: garbage in, garbage out.
Should every smart car system everywhere have a complete and full understanding of all speed limits everywhere in order to prevent situations like this?
Probably not. Should people be punished for vandalizing signs in such a way that people could get killed as a result? Absolutely. What type of punishment that should be, I don't know, but we're not supposed to hurt each other either and yet we have laws in place to punish the people that go around punching people.
Should the cars be limited in how fast they can go on non-highways? That sounds like a good idea, and anyone who feels their car's free speech (or whatever) is being violated can just grab the wheel and press the accelerator before those options are taken from us later. Otherwise, the cars follow the laws and could enforce a speed limit (imagine that!) based on more than just the signage. Maybe a human could insert some logic into the process, like "85 in a school zone is not permissible no matter what the signs say." GPS has certainly allowed for the mapping out of school zones; just ask Google.
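The sanity check described above can be sketched in a few lines. This is a hypothetical illustration, not how any shipping system works: the road classes, caps, and function names are all made up. The idea is just that a camera-read limit should never exceed what mapped road data says is plausible for the current road class.

```python
# Hypothetical caps per road class, as might come from map data.
ROAD_CLASS_CAP_MPH = {
    "school_zone": 25,
    "residential": 35,
    "arterial": 55,
    "highway": 85,
}

def plausible_limit(camera_mph: int, road_class: str) -> int:
    """Clamp a camera-read speed limit to the cap for this road class."""
    cap = ROAD_CLASS_CAP_MPH.get(road_class, 55)  # conservative default
    return min(camera_mph, cap)

# A sticker-altered "85" sign in a school zone gets clamped to 25;
# the same reading on a highway is allowed through.
print(plausible_limit(85, "school_zone"))  # 25
print(plausible_limit(85, "highway"))      # 85
```

A clamp like this doesn't detect the attack, it just bounds the damage: the worst a forged sign can do is push the car to the cap for that road class.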
It is likely true that too much trust is being placed into a system that expects everything to follow the established rules. There are way too many variations of how rules can be broken to account for and program a solution to them all.
I think that a good part of the solution would be to properly incentivize people to not be jerks, punish the ones that do, and maybe for really important traffic corridors--have a means of updating the known speed limits in real time, which of course opens up its own can of worms considering some clown could broadcast fake speed limits or hack into the database and change the values or...
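The "clown broadcasting fake speed limits" worm in that can could at least be addressed by authenticating the broadcasts. Here is a minimal sketch using an HMAC with a shared secret; everything in it (the key, the message fields) is invented for illustration, and a real deployment would use public-key signatures with proper key distribution rather than a shared secret.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret between the road authority and receivers.
SECRET = b"road-authority-demo-key"

def sign_update(update: dict) -> dict:
    """Serialize a speed-limit update and attach an authentication tag."""
    payload = json.dumps(update, sort_keys=True).encode()
    tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify_update(msg: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(SECRET, msg["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

msg = sign_update({"road": "I-5 mile 132", "limit_mph": 55})
print(verify_update(msg))  # True

# A tampered broadcast (55 changed to 85) fails verification.
forged = dict(msg, payload=msg["payload"].replace("55", "85"))
print(verify_update(forged))  # False
```

This doesn't stop someone from hacking the database behind the signer, of course, which is exactly the commenter's point about cans of worms.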
Security and trust will never be a resolvable issue for open systems like this that pretend they are closed, especially when it comes down to a person outside that system deciding to shoot paintballs at signs, or to cleverly design stickers that change signs' meanings in order to, I don't know, generate ad revenue on their blog by reporting the obvious: that changing a 1 into a 7 increases something by 6 in the same way that changing a 3 into an 8 increases it by 5.
(Score: 0) by Anonymous Coward on Thursday February 20 2020, @06:43PM
" GPS has certainly allowed for the mapping out of school zones, just ask Google"
Yeah, but sometimes the GPS gets confused and thinks you're on a freeway when you're not, just because you're in close proximity to one. Perhaps more should be done to protect against this as well (terrain mapping?).
(Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:00PM (2 children)
There are much easier ways to be a vandal these days as well. You can put some crazy obstacle on the freeway and cause all sorts of accidents: a bunch of nails, a spike strip, a piece of wood with nails in it. Imagine what a vandal could do to train tracks to cause a train wreck. Yet somehow we manage to get along. We don't ban cars and trains because they can be vandalized; otherwise everything would get banned.
I don't think the assumption should be that people will be vandals whenever possible. Sure, there should be ways to catch vandals (the cars should have cameras, there should be security and cameras around, etc.), punishment when they do get caught to deter them, and systems that are resistant to vandalism. But refusing to do anything because something can be vandalized would have everyone living in a bubble. It wouldn't be practical.
(Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:13PM (1 child)
My car is parked in a parking lot right now. Someone can take a rock and bash my windows in. OH NO!!! I should have walked to work!! LETS BAN CARS!!!!!
(Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:22PM
Errr ... Let's *
(Score: 1) by speederaser on Friday February 21 2020, @01:29AM
My Garmin does that now without a camera or an internet connection. Always knows the speed limit everywhere I go, and whether an intersection is controlled by a light or stop signs. Shouldn't be a problem at all for smart cars.