
posted by janrinok on Wednesday January 10 2018, @04:34PM   Printer-friendly
from the do-you-see-what-I-see? dept.

Image recognition technology may be sophisticated, but it is also easily duped. Researchers have fooled algorithms into mistaking two skiers for a dog, a baseball for espresso, and a turtle for a rifle. But a new method of deceiving the machines is simple and far-reaching, involving just a humble sticker.

Google researchers developed a psychedelic sticker that, when placed in an unrelated image, tricks deep learning systems into classifying the image as a toaster. According to a recently submitted research paper about the attack, this adversarial patch is "scene-independent," meaning someone could deploy it "without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene." It's also easily accessible, given it can be shared and printed from the internet.
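The "scene-independent" property means the attacker only has to composite the printed sticker somewhere in the camera's view. The application step can be sketched in a few lines of numpy; this is an illustrative stand-in (the function name and array shapes are my own, not the researchers' code), and the real attack additionally optimizes the patch pixels against a classifier over many random placements and transforms:

```python
import numpy as np

def apply_patch(image, patch, rng=None):
    """Composite a circular adversarial patch into an image at a random
    location. Illustrative only: the actual attack also trains the patch
    pixels to maximize the target class (e.g. "toaster") under random
    positions, scales, and rotations."""
    rng = np.random.default_rng() if rng is None else rng
    h, w, _ = image.shape
    p = patch.shape[0]  # square patch array, p x p pixels
    # Circular mask so the patch can be printed as a round sticker.
    yy, xx = np.mgrid[0:p, 0:p]
    mask = (yy - p / 2) ** 2 + (xx - p / 2) ** 2 <= (p / 2) ** 2
    # Random top-left corner that keeps the patch fully inside the image.
    y = int(rng.integers(0, h - p + 1))
    x = int(rng.integers(0, w - p + 1))
    out = image.copy()
    region = out[y:y + p, x:x + p]
    region[mask] = patch[mask]  # overwrite only the circular area
    return out, (y, x)
```

Because classification then depends on the patch rather than the scene, the same printout works regardless of lighting, camera angle, or the other objects in frame.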


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Wednesday January 10 2018, @05:03PM (1 child)

    by Anonymous Coward on Wednesday January 10 2018, @05:03PM (#620517)

    It's actually toasters all the way down. Time to re-build all the ancient Hindu statues that depict the earth supported on turtles...

    https://en.wikipedia.org/wiki/World_Turtle [wikipedia.org]

  • (Score: 3, Funny) by c0lo on Wednesday January 10 2018, @10:41PM

    by c0lo (156) Subscriber Badge on Wednesday January 10 2018, @10:41PM (#620691) Journal

    Time to re-build all the ancient Hindu statues that depict the earth supported on rifles [gizmodo.com]...

    FTFY

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0