from the GIGO dept.
Garments from Adversarial Fashion feed junk data into surveillance cameras, in an effort to make their databases less effective.
The news: Hacker and designer Kate Rose unveiled the new range of clothing at the DefCon cybersecurity conference in Las Vegas. In a talk, she explained that the hoodies, shirts, dresses, and skirts trigger automated license plate readers (ALPRs), injecting useless data into systems used to track civilians.
False tags: The license-plate-like designs on a garment are picked up and recorded as vehicles by readers, which frequently misclassify images like fences as license plates anyway, according to Rose (pictured above modeling one of her dresses). The idea is that feeding more junk data into the systems will make them less effective at tracking people and more expensive to deploy.
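The mechanism above can be sketched in a few lines. This is a hedged illustration only: the regex is a hypothetical US-style plate pattern, not any real ALPR vendor's logic, and the sample strings (including the plate-like shirt text) are invented. It shows why text formatted like a plate can survive naive validation and land in the database alongside real reads.

```python
import re

# Hypothetical plate pattern: 2-3 alphanumerics, optional separator,
# then 3-4 alphanumerics. Real ALPR pipelines differ.
PLATE_RE = re.compile(r"^[A-Z0-9]{2,3}[- ]?[A-Z0-9]{3,4}$")

readings = [
    "7XK 2481",    # a genuine plate read
    "AF 1984",     # plate-like text printed on a garment (invented example)
    "hello world"  # clearly not a plate; filtered out
]

# Junk that matches the plate format passes the filter and is
# recorded as a vehicle sighting, diluting the database.
valid = [r for r in readings if PLATE_RE.match(r.upper().strip())]
print(valid)
```

Both the genuine read and the garment text survive the filter, which is the point: the system cannot cheaply distinguish plate-shaped junk from real plates.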
[...] Fashion fights back: Though it's the first to target ALPRs, this isn't the first fashion project aimed at fighting back against surveillance. Researchers have come up with adversarial images on clothing aimed at bamboozling AI, makeup that lets you hide your face from recognition systems, and even a hat that can trick systems into thinking you're Moby.
This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.
Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla's automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee's Advanced Threat Research team.
The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year's Model S sped up by 50 miles per hour.
This is the latest in an increasing mountain of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.
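The class of attacks referenced here can be sketched with the fast gradient sign method (FGSM) on a toy model. This is an illustrative example of the general technique, not McAfee's actual method (which used a physical sticker on a sign); the classifier, weights, and inputs below are all invented.

```python
import numpy as np

# A toy "speed sign classifier": logistic regression with fixed
# weights. Output is the probability the sign reads "85".
w = np.array([1.0, -2.0, 0.5])
b = 0.1

def predict(x):
    return 1 / (1 + np.exp(-(w @ x + b)))

x = np.array([0.5, 1.0, -0.5])  # a benign input, classified "35"

# FGSM: nudge each input feature in the direction that increases the
# model's score. For a linear model, that direction is sign(w).
eps = 0.9
x_adv = x + eps * np.sign(w)

print(predict(x), predict(x_adv))  # the small perturbation flips the label
```

The perturbation is bounded per feature by `eps`, which is what makes such attacks "nearly imperceptible" in the image domain: each pixel moves only slightly, yet the classification flips.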
[...] Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its cameras that in preliminary testing were not susceptible to this exact attack.
There are still a sizable number of Tesla cars operating with the vulnerable hardware, Povolny said. He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware.
"What we're trying to do is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible," Povolny said. "We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it."
So, it seems this is not so much that a particular adversarial attack was successful (and fixed), but that it was but one instance of a potentially huge set. Obligatory xkcd.