Image recognition technology may be sophisticated, but it is also easily duped. Researchers have fooled algorithms into confusing two skiers for a dog, a baseball for espresso, and a turtle for a rifle. But a new method of deceiving the machines is simple and far-reaching, involving just a humble sticker.
Google researchers developed a psychedelic sticker that, when placed in an unrelated image, tricks deep learning systems into classifying the image as a toaster. According to a recently submitted research paper about the attack, this adversarial patch is "scene-independent," meaning someone could deploy it "without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene." It's also easily accessible, given it can be shared and printed from the internet.
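The mechanics of deploying such a patch are trivially simple, which is part of what makes the attack far-reaching: the sticker is just pasted somewhere in the frame, with no per-scene tuning. Here's a minimal sketch in Python of that pasting step (the patch values, sizes, and coordinates are illustrative assumptions, not taken from the paper; a real attack would feed the result to a pretrained image classifier and observe the "toaster" label):

```python
import numpy as np

def apply_patch(image, patch, top, left):
    """Overlay an adversarial patch onto an image at (top, left).

    The patch is pasted as-is, with no knowledge of lighting, camera
    angle, or scene content -- mirroring the paper's "scene-independent"
    property. `image` and `patch` are H x W x 3 uint8 arrays.
    """
    out = image.copy()
    h, w = patch.shape[:2]
    out[top:top + h, left:left + w] = patch
    return out

# Hypothetical example: a 32x32 "psychedelic" patch pasted into a
# 224x224 scene. The scene here is a blank placeholder; in practice
# it could be any photo -- a banana on a table, say.
scene = np.zeros((224, 224, 3), dtype=np.uint8)
patch = np.random.randint(0, 256, (32, 32, 3), dtype=np.uint8)
patched = apply_patch(scene, patch, top=96, left=96)
```

The hard part of the attack is training the patch itself (optimizing its pixels so classifiers favor "toaster" across many scenes and transformations); deploying it is just this copy-and-paste.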
(Score: 2) by requerdanos on Thursday January 11 2018, @12:57AM
Well, no, not like that.
A ski mask covers almost 100% of the face, and even then it's still completely identifiable as a face. Zoom in enough and you can even get an identifying retina scan.
This isn't like that. Consider the differences between a ski mask (covers almost all of the face, yet is still clearly a face) and this sticker (covers none of the subject, yet gets it misidentified as something completely different):
- The sticker does not have to cover up the banana (or other subject) to get the image classified as a toaster.
- The sticker does not have to cover up even a very large percentage of the image to get the image classified as a toaster.
- The sticker does not have to convincingly depict a toaster to get the image classified as a toaster.
Heck, you could probably wear reflective sunglasses with the toaster sticker design printed on each lens and get identified as a tall, mobile toaster. If you lived in the movie Minority Report, in fact, I'd recommend it, so department stores aren't constantly asking, "How did you like those jeans, Mr. Yakamoto?"