
posted by janrinok on Wednesday July 01 2015, @05:29AM   Printer-friendly
from the did-you-really-just-call-her-that? dept.

Google Photos tries to categorize your pictures automatically. Until very recently, it had a failure mode in which it classified some pictures of humans as "Gorillas".

Google reacted [and apologised] very quickly after Jacky Alcine, a black man, complained that photos of him and a friend had been misclassified.

When Brooklyn-based computer programmer Jacky Alcine looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you, “Search by what you remember about a photo, no description needed.” In Alcine’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on.

Then there was a picture of Alcine and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On investigation, Alcine found that many more photographs of the pair—and nothing else—had been placed under this literally dehumanizing rubric.

Speculating: it's possible that their software leans heavily on statistical matching and is genuinely hard to debug, which would explain why they wound up simply removing "Gorilla" from the list of possible categories.
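If that guess is right, the quick fix is not a better model but a filter on its output. A minimal sketch of what such a label blocklist might look like (purely illustrative; the names and pipeline here are assumptions, not Google's actual code):

# Hypothetical sketch of a label blocklist bolted onto a classifier's output.
# The model may still produce the bad label internally; the service just never
# shows it. Function and variable names are made up for illustration.

BLOCKED_LABELS = {"gorilla"}          # categories never surfaced to users

def visible_labels(predictions, threshold=0.5):
    """predictions: list of (label, confidence) pairs from some image classifier."""
    return [
        (label, score)
        for label, score in predictions
        if score >= threshold and label.lower() not in BLOCKED_LABELS
    ]

# Example: the raw model output still contains the offending label,
# but the filtered result shown to the user does not.
raw = [("gorilla", 0.91), ("outdoors", 0.85), ("person", 0.40)]
print(visible_labels(raw))            # [('outdoors', 0.85)]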

http://www.slate.com/blogs/future_tense/2015/06/30/google_s_image_recognition_software_returns_some_surprisingly_racist_results.html


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Insightful) by GoonDu on Wednesday July 01 2015, @05:48AM

    by GoonDu (2623) on Wednesday July 01 2015, @05:48AM (#203636)

    What if the person in the photograph does look like a gorilla? On a serious note, if an image recognition algorithm misclassifies something, I would not go so far as to cry 'racism'. It's simply a mistake.

  • (Score: 1) by Pino P on Wednesday July 01 2015, @01:53PM

    by Pino P (4721) on Wednesday July 01 2015, @01:53PM (#203756) Journal

    What if the person in the photograph does look like a gorilla?

    Break ties with the pose. The hip of a gorilla and the hip of a human lead to very different postures even if both are walking bipedally. If it looks like a gorilla and walks like a gorilla, it's a gorilla. If it looks like a human and walks like a human, it's a human. If it looks like a gorilla and walks like a human, it's a well-known hoax known as Bigfoot. In other words, a human.
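As a toy illustration of that tie-breaking rule (both the classifier and the pose check here are assumptions; building a reliable posture model is the hard part):

# Toy sketch of "break ties with the pose": when the classifier can't decide
# between "person" and "gorilla", let a (hypothetical) posture check settle it.

def break_tie(label_scores, pose_looks_human, margin=0.10):
    """label_scores: dict mapping label -> confidence from some classifier.
    pose_looks_human: bool from a separate, assumed pose/posture model."""
    person = label_scores.get("person", 0.0)
    gorilla = label_scores.get("gorilla", 0.0)
    if abs(person - gorilla) >= margin:
        return "person" if person > gorilla else "gorilla"   # no real tie
    # Looks like a gorilla but walks like a human -> the "Bigfoot" case: a human.
    return "person" if pose_looks_human else "gorilla"

print(break_tie({"person": 0.48, "gorilla": 0.46}, pose_looks_human=True))  # person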

    • (Score: 2) by physicsmajor on Wednesday July 01 2015, @04:56PM

      by physicsmajor (1471) on Wednesday July 01 2015, @04:56PM (#203827)

      Nice ideas that don't work at all with deep learning. These neural nets have no human prior information other than the labels on their training data. If you want one to learn to differentiate two things it's misclassifying, you have to feed it more correctly labeled training data on either side of the error.
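For concreteness, "feed it more correctly labeled training data" usually means fine-tuning the existing network on the new examples. A rough sketch of that (PyTorch; the folder of freshly labeled images is hypothetical):

# Sketch of fixing a misclassification by fine-tuning on extra labeled data.
# Assumes a folder like extra_labeled/person/*.jpg, extra_labeled/gorilla/*.jpg
# of newly collected, correctly labeled images (hypothetical path).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
extra = datasets.ImageFolder("extra_labeled", transform=transform)
loader = DataLoader(extra, batch_size=32, shuffle=True)

# Start from a pretrained backbone and retrain the output layer on the new classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(extra.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                      # a few passes over the new examples
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()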

    • (Score: 2) by vux984 on Wednesday July 01 2015, @07:19PM

      by vux984 (5045) on Wednesday July 01 2015, @07:19PM (#203915)

      Break ties with the pose.

      Not sure how that's going to work with shots that don't show them completely; what do you do if the 'posture' isn't in the shot?
      Unless you mean "if it's shot at a selfie angle, assume human"? That would work; or better still, just tag such photos "mouth breather".