Image recognition technology may be sophisticated, but it is also easily duped. Researchers have fooled algorithms into confusing two skiers for a dog, a baseball for espresso, and a turtle for a rifle. But a new method of deceiving the machines is simple and far-reaching, involving just a humble sticker.
Google researchers developed a psychedelic sticker that, when placed in an unrelated image, tricks deep learning systems into classifying the image as a toaster. According to a recently submitted research paper about the attack, this adversarial patch is "scene-independent," meaning someone could deploy it "without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene." It's also easily accessible, given it can be shared and printed from the internet.
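In the paper, the patch itself is *optimized* by gradient ascent over many scenes, positions, scales, and rotations so that classifiers report the target class ("toaster") regardless of context. The full optimization needs a trained network, but the scene-independent *application* step is easy to illustrate. The sketch below (names like `apply_patch` are illustrative, not from the paper) just pastes a patch at a random position and rotation, with no knowledge of lighting, camera angle, or the rest of the scene:

```python
import numpy as np

def apply_patch(image, patch, rng=None):
    """Paste a square patch into an image at a random location and rotation.

    A real adversarial patch is trained so that, after this kind of random
    placement, classifiers still output the attacker's target class; here
    we only model the scene-independent application step.
    """
    rng = rng or np.random.default_rng()
    h, w = image.shape[:2]
    ph, pw = patch.shape[:2]
    # Random rotation by a multiple of 90 degrees (a cheap stand-in for
    # arbitrary rotation) and a uniformly random position in the image.
    patch = np.rot90(patch, k=int(rng.integers(0, 4)))
    y = int(rng.integers(0, h - ph + 1))
    x = int(rng.integers(0, w - pw + 1))
    out = image.copy()
    out[y:y + ph, x:x + pw] = patch  # simply occludes pixels; nothing else
    return out

scene = np.zeros((224, 224, 3), dtype=np.uint8)      # stand-in photo
sticker = np.full((50, 50, 3), 255, dtype=np.uint8)  # stand-in patch
attacked = apply_patch(scene, sticker)
```

Because the attack only occludes a small region rather than imperceptibly perturbing every pixel, it survives printing and re-photographing, which is what makes a physical sticker practical.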
(Score: 1, Insightful) by Anonymous Coward on Wednesday January 10 2018, @05:08PM (7 children)
From https://gizmodo.com/this-simple-sticker-can-trick-neural-networks-into-thin-1821735479 [gizmodo.com]
(Score: 0) by Anonymous Coward on Wednesday January 10 2018, @05:19PM (2 children)
Makes me wonder. Will clothing and fashion accessories that defeat facial recognition become illegal because they could lead to death for occupants of autonomous vehicles?
Does that mean that machine vision is still not ready for self-driving cars?
Moreover, do we want to imagine a world where machine vision is ready for self-driving cars and cannot be fooled by clever clothing and fashion accessories?
(Score: 2) by bob_super on Wednesday January 10 2018, @05:28PM
"We need those self-driving cars, because they save lives! Therefore, all clothing is now banned!"
CA: Fine, man!
ND, WY: No self-driving cars!
FL: Self-driving vehicles allowed, but not near retirement communities...
(Score: 0) by Anonymous Coward on Wednesday January 10 2018, @05:51PM
> Does that mean that machine vision is still not ready for self-driving cars?
All the might of the tech industry, brought to its knees by graffiti artists.
(Score: 2) by HiThere on Wednesday January 10 2018, @06:20PM
And that's why it's important that this stuff be done *now*, so the algorithms can be hardened before the cars become common.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by TheRaven on Thursday January 11 2018, @11:08AM (2 children)
sudo mod me up
(Score: 2) by Wootery on Friday January 12 2018, @01:30PM (1 child)
Two ideas spring to mind:
(Score: 3, Insightful) by TheRaven on Saturday January 13 2018, @03:17PM
That doesn't really help, because it assumes non-malicious mislabelling. It's analogous to error correction: ECC will protect you against all of the bit flips that are likely to occur accidentally, but if an attacker can flip a few bits intelligently then they can get past it.
That's more likely, but it's very computationally expensive (even by machine-learning standards) and it has the same problem: an intelligent adversary is unlikely to pick the same variations as something that is not intelligently directed. Any machine learning approach gives you an approximation - the techniques are inherently unsuitable for producing anything else - and an intelligent adversary will always be able to find places where an approximation is wrong.
sudo mod me up
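The ECC analogy above can be made concrete with a Hamming(7,4) code: random single-bit flips produce a nonzero syndrome and get corrected, but an attacker who XORs in another valid codeword changes three bits and sails through with a clean syndrome. (This is a standard textbook construction, sketched here as an illustration, not code from the thread.)

```python
import numpy as np

# Systematic Hamming(7,4): G = [I4 | P], H = [P^T | I3], arithmetic mod 2.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data):
    """Map 4 data bits to a 7-bit codeword."""
    return (np.array(data) @ G) % 2

def syndrome(word):
    """Zero syndrome means the word looks like a valid codeword."""
    return (H @ word) % 2

cw = encode([1, 0, 1, 1])

# Accidental noise: one random bit flip gives a nonzero syndrome,
# which the decoder detects and corrects.
noisy = cw.copy()
noisy[5] ^= 1
assert syndrome(noisy).any()

# Intelligent adversary: XOR with another (here weight-3) codeword.
# The result is again a valid codeword, so the syndrome is zero and
# the decoder accepts data that differs in three bit positions.
forged = cw ^ encode([1, 1, 1, 0])
assert not syndrome(forged).any()
```

The defence is tuned to the error distribution of *accidental* noise; an adversary is free to choose errors from outside that distribution, which is the same gap adversarially trained classifiers leave open.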