Image recognition technology may be sophisticated, but it is also easily duped. Researchers have fooled algorithms into mistaking two skiers for a dog, a baseball for espresso, and a turtle for a rifle. But a new method of deceiving the machines is simple and far-reaching, involving just a humble sticker.
Google researchers developed a psychedelic sticker that, when placed in an unrelated image, tricks deep learning systems into classifying the image as a toaster. According to a recently submitted research paper about the attack, this adversarial patch is "scene-independent," meaning someone could deploy it "without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene." It's also easily accessible: the patch can be shared online and printed by anyone.
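The core idea can be sketched in a few lines. The real attack optimizes the patch by gradient ascent against actual image classifiers over many random scenes and placements; the toy version below is a minimal illustration under stated assumptions (a made-up linear "toaster" classifier and a simple hill-climbing search stand in for the real networks and optimizer). It shows the shape of the attack: tune one small patch so that, pasted anywhere into an image, it drags the classifier's "toaster" score up.

```python
# Toy sketch of a universal adversarial patch. Everything here is an
# illustrative assumption: the classifier is a fixed random linear template,
# not a real network, and the optimizer is accept-if-better random search,
# not gradient ascent as in the paper.
import numpy as np

rng = np.random.default_rng(0)

IMG, P = 16, 4                                   # 16x16 scenes, 4x4 patch
toaster_template = rng.normal(size=(IMG, IMG))   # toy "toaster" class weights

def paste(image, patch, row, col):
    """Return a copy of image with the patch pasted at (row, col)."""
    out = image.copy()
    out[row:row + P, col:col + P] = patch
    return out

def toaster_score(image):
    """Toy linear classifier: dot product with the toaster template."""
    return float(np.sum(image * toaster_template))

# A fixed set of unrelated scenes and patch placements, so the objective
# is deterministic and "scene-independence" means doing well on all of them.
scenes = [rng.normal(size=(IMG, IMG)) for _ in range(5)]
spots = [(int(rng.integers(0, IMG - P)), int(rng.integers(0, IMG - P)))
         for _ in scenes]

def objective(patch):
    """Mean toaster score of the patch pasted into every scene."""
    return float(np.mean([toaster_score(paste(s, patch, r, c))
                          for s, (r, c) in zip(scenes, spots)]))

# Optimize the patch: keep a random perturbation only if it raises the
# average toaster score across all scenes.
patch = np.zeros((P, P))
best = objective(patch)
for _ in range(300):
    cand = np.clip(patch + rng.normal(scale=0.5, size=(P, P)), -3, 3)
    score = objective(cand)
    if score > best:
        patch, best = cand, score
```

Because the patch is scored on its average effect across many scenes and positions, the search converges on something that works regardless of what else is in the picture, which is exactly why a printed sticker version of the real patch transfers so easily.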
(Score: 5, Funny) by SomeGuy on Wednesday January 10 2018, @05:26PM (1 child)
I knew I shouldn't have left the AfterDark Flying Toasters screen saver running on the Neural Network computers.
(Score: 0) by Anonymous Coward on Wednesday January 10 2018, @06:17PM
In high school I spent my rebellious teen years violating Galactic Ordinance 729.881-ZT13: accelerating toasters to unsafe speeds.
Mr. Glitch's Retro Reviews: [After Dark] Lunatic Fringe [blogspot.com]