
posted by janrinok on Wednesday January 10 2018, @04:34PM
from the do-you-see-what-I-see? dept.

Image recognition technology may be sophisticated, but it is also easily duped. Researchers have fooled algorithms into mistaking two skiers for a dog, a baseball for espresso, and a turtle for a rifle. But a new method of deceiving the machines is simple and far-reaching, involving just a humble sticker.

Google researchers developed a psychedelic sticker that, when placed in an unrelated image, tricks deep learning systems into classifying the image as a toaster. According to a recently submitted research paper about the attack, this adversarial patch is "scene-independent," meaning someone could deploy it "without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene." It's also easily accessible, given it can be shared and printed from the internet.
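
In rough outline, the attack trains the sticker's pixels to maximize the classifier's score for the target class across many scenes, placements, rotations, and scales, so the finished patch works wherever it lands. Below is a minimal sketch of that idea in PyTorch, assuming a torchvision ResNet-50 as a stand-in for the attacked model; the patch size, class index, transform ranges, and training loop are illustrative guesses, not the values or code from the paper.

    import torch
    import torchvision.models as models
    import torchvision.transforms.functional as TF

    # Stand-in for the attacked classifier (frozen; only the patch is optimized).
    model = models.resnet50(pretrained=True).eval()
    for p in model.parameters():
        p.requires_grad_(False)

    TOASTER = 859  # ImageNet class index for "toaster"
    patch = torch.rand(3, 64, 64, requires_grad=True)  # the trainable sticker
    opt = torch.optim.Adam([patch], lr=0.05)

    def paste(img, sticker):
        # Apply a random rotation, scale, and location each step so the
        # trained patch becomes robust to placement (the "scene-independent"
        # property described above).
        angle = float(torch.empty(1).uniform_(-45.0, 45.0))
        scale = float(torch.empty(1).uniform_(0.8, 1.2))
        s = TF.rotate(sticker.clamp(0, 1), angle)
        s = TF.resize(s, [int(64 * scale)] * 2)
        x = int(torch.randint(0, img.shape[2] - s.shape[2], (1,)))
        y = int(torch.randint(0, img.shape[1] - s.shape[1], (1,)))
        out = img.clone()
        out[:, y:y + s.shape[1], x:x + s.shape[2]] = s
        return out

    # Any batch of unrelated photos would do; random noise stands in here.
    # (ImageNet mean/std normalization is omitted for brevity.)
    scenes = torch.rand(16, 3, 224, 224)

    for step in range(200):
        opt.zero_grad()
        batch = torch.stack([paste(img, patch) for img in scenes])
        logits = model(batch)
        loss = -logits[:, TOASTER].mean()  # push every scene toward "toaster"
        loss.backward()
        opt.step()

Once trained, the patch is just an image: print it, place it next to a banana, and the classifier reports a toaster, which is why the paper stresses that it can be shared and printed from the internet.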


Original Submission

 
  • (Score: 1) by starvingboy on Wednesday January 10 2018, @04:56PM (12 children)

    by starvingboy (6766) on Wednesday January 10 2018, @04:56PM (#620510)

    I wonder if they'll come up with something similar for facial recognition software. It'd disable tracking via outerwear.

  • (Score: 3, Informative) by Snow on Wednesday January 10 2018, @05:00PM (3 children)

    by Snow (1601) on Wednesday January 10 2018, @05:00PM (#620513) Journal

    Like a ski mask?

    • (Score: 2) by ese002 on Wednesday January 10 2018, @10:50PM (1 child)

      by ese002 (5306) on Wednesday January 10 2018, @10:50PM (#620695)

      Like a ski mask?

      Wearing a ski mask blocks facial recognition but will make you highly conspicuous. All the humans will be watching you and, likely, so will the cameras.

      A suitable adversarial patch might defeat facial recognition while still being inconspicuous to humans.

      • (Score: 2) by fliptop on Thursday January 11 2018, @05:13AM

        by fliptop (1666) on Thursday January 11 2018, @05:13AM (#620807) Journal

        Wearing a ski mask...will make you highly conspicuous.

        It's also illegal [thedaonline.com] in some states.

        --
        It's crackers to slip a rozzer the dropsy in snide.
    • (Score: 2) by requerdanos on Thursday January 11 2018, @12:57AM

      by requerdanos (5997) on Thursday January 11 2018, @12:57AM (#620741) Journal

      Like a ski mask?

      Well, no, not like that.

      A ski mask covers almost 100% of the face, and even then it's still completely identifiable as a face. Zoom in enough and you can even get an identifying iris scan.

      This isn't like that. Think about the differences between a ski mask (covers almost all the face and still, it's clearly a face) vs. our sticker (covers none of the whatever and still gets it misidentified as something completely different):

      - The sticker does not have to cover up the banana (or other subject) to get the image classified as a toaster.
      - The sticker does not have to cover up even a very large percentage of the image to get the image classified as a toaster.
      - The sticker does not have to convincingly depict a toaster to get the image classified as a toaster.

      Heck, you could probably wear reflective sunglasses with the toaster sticker design printed on each lens and get identified as a tall, mobile toaster. In fact, if you lived in the world of Minority Report, I'd recommend it, so department stores aren't constantly asking, "How did you like those jeans, Mr. Yakamoto?"

    • (Score: 1) by fustakrakich on Wednesday January 10 2018, @05:51PM (1 child)

      by fustakrakich (6150) on Wednesday January 10 2018, @05:51PM (#620541) Journal

      Oh no, you got it all wrong [wordpress.com]...

      *are hashtags capitalized?

      --
      Ok, we paid the ransom. Do I get my dog back? REDЯUM
      • (Score: 0) by Anonymous Coward on Thursday January 11 2018, @02:44AM

        by Anonymous Coward on Thursday January 11 2018, @02:44AM (#620776)
        You should use a Trump mask and maybe #MAGA and neo-nazi stuff. The media has been rabidly trying to pin anything and everything on Trump. So there'll be a higher chance of the resulting stories being about Trump and not about what you actually did and who you are ;).
    • (Score: 2) by TheRaven on Thursday January 11 2018, @11:04AM

      by TheRaven (270) on Thursday January 11 2018, @11:04AM (#620877) Journal
      Those obscure your face entirely, and look suspicious. The goal is to have something that is ignored by humans, but makes face recognition algorithms either not classify you as a thing with a face at all, or classify you as some innocuous face.

      Because most of these deep learning buzzwordy systems are correlation engines working on unknown parameters, it's often possible to alter something that they detect in such a way that you get a completely different result, with minimal changes to the input (last year there was a single-pixel change to an image that caused Google's system to recognise a car as a dog or vice versa, for example).
      --
      sudo mod me up
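
As an aside for readers curious how small such changes can be: the standard textbook version of this is the fast gradient sign method, which nudges every pixel a tiny step in the direction that most increases the classifier's loss. Here is a minimal sketch under the same assumptions as the earlier example, a torchvision stand-in model with an illustrative epsilon and random stand-in image (the single-pixel attack mentioned above is a different, related technique):

    import torch
    import torch.nn.functional as F
    import torchvision.models as models

    model = models.resnet50(pretrained=True).eval()

    img = torch.rand(1, 3, 224, 224, requires_grad=True)  # stand-in photo
    label = model(img).argmax(dim=1)  # the model's current prediction

    # Gradient of the loss with respect to the input pixels.
    loss = F.cross_entropy(model(img), label)
    loss.backward()

    eps = 2.0 / 255  # roughly a +/-2 change per 8-bit pixel value
    adv = (img + eps * img.grad.sign()).clamp(0, 1).detach()

    print(model(adv).argmax(dim=1))  # often no longer equals `label`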
  • (Score: 2) by bob_super on Wednesday January 10 2018, @05:25PM (1 child)

    by bob_super (1357) on Wednesday January 10 2018, @05:25PM (#620528)

    I'm already in contact with cap manufacturers to mass-print that pattern.
    $50 for the Safe From Facial Tracking caps. Gonna be Huuge in China. I'll call you to confirm whether the tinfoil hat people make me a billionaire or sue...

  • (Score: 2) by LoRdTAW on Wednesday January 10 2018, @07:28PM

    by LoRdTAW (3755) on Wednesday January 10 2018, @07:28PM (#620586) Journal
  • (Score: 2) by sgleysti on Thursday January 11 2018, @02:57AM

    by sgleysti (56) on Thursday January 11 2018, @02:57AM (#620780)

    Like the ugly shirt in Gibson's novel "Zero History".