posted by janrinok on Wednesday July 01 2015, @05:29AM
from the did-you-really-just-call-her-that? dept.

Google Photo tries to categorize your pictures automatically. Until very recently, it had a failure mode in which its classification for some pictures of humans was "Gorillas".

Google reacted [and apologised] very quickly after getting a complaint from Jacky Alcine, a black programmer whose photos had been misclassified.

When Brooklyn-based computer programmer Jacky Alcine looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you, “Search by what you remember about a photo, no description needed.” In Alcine’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on.

Then there was a picture of Alcine and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On investigation, Alcine found that many more photographs of the pair—and nothing else—had been placed under this literally dehumanizing rubric.

Speculating: their software is probably heavy on statistical matching, which is genuinely hard to debug, and that may be why they wound up simply deleting "Gorilla" from the list of possible categories (roughly the kind of blocklist filter sketched below).
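
For context, here is a minimal sketch of what such a stop-gap might look like, assuming a post-processing blocklist applied to the classifier's output rather than any fix to the model itself. Every name in it (BLOCKED_LABELS, safe_labels, the (label, confidence) output format) is a hypothetical illustration, not Google's actual code:

    # Hypothetical stop-gap: drop blocked labels from classifier output
    # before they are ever shown to the user, instead of retraining.
    BLOCKED_LABELS = {"gorilla", "gorillas"}   # labels never to surface

    def safe_labels(predictions, min_confidence=0.5):
        """Filter raw (label, confidence) pairs before they reach the photo UI.

        'predictions' is whatever the image classifier emits; anything on
        the blocklist is discarded outright, however confident the model is.
        """
        return [
            (label, conf)
            for label, conf in predictions
            if conf >= min_confidence and label.lower() not in BLOCKED_LABELS
        ]

    # Example: the model can still be statistically confused internally,
    # but the offending label can no longer appear as an album category.
    raw = [("Gorillas", 0.91), ("Person", 0.40), ("Outdoors", 0.77)]
    print(safe_labels(raw))   # -> [('Outdoors', 0.77)]

If something like this is what happened, the misclassification itself was never fixed; the label was simply prevented from reaching the album UI, which would be consistent with "Gorilla" vanishing as a possible category.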

http://www.slate.com/blogs/future_tense/2015/06/30/google_s_image_recognition_software_returns_some_surprisingly_racist_results.html


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2, Touché) by albert on Wednesday July 01 2015, @05:16PM

    by albert (276) on Wednesday July 01 2015, @05:16PM (#203839)

    We learn to suppress such thoughts as we grow older. We learn there are things you can't say or even think.

Other good ones involve obese people being mistaken for pregnant, especially when male. :-)

  • (Score: 0) by Anonymous Coward on Wednesday July 01 2015, @07:45PM

    by Anonymous Coward on Wednesday July 01 2015, @07:45PM (#203937)

Yes. My point was... If innocent children can make that mistake, then so can a computer algorithm. I have yet to see a racist program, unless it was intentional. Good point about the mistaken pregnancy label; I've made that one myself.

    • (Score: 0) by Anonymous Coward on Thursday July 02 2015, @02:39AM

      by Anonymous Coward on Thursday July 02 2015, @02:39AM (#204055)

      > I have yet to see a racist program, unless it was intentional.

That is circular reasoning: software can't be racist because, by your definition, it is merely ignorant, so of course you've never seen an unintentionally racist program.

      And yet here we have an example of one.

  • (Score: 0) by Anonymous Coward on Thursday July 02 2015, @02:44AM

    by Anonymous Coward on Thursday July 02 2015, @02:44AM (#204057)

    > We learn to suppress such thoughts as we grow older. We learn there are things you can't say or even think.

    Maybe you learned that, but you learned the wrong lesson.

    What normal people learn is that being needlessly insulting is a shitty way to live your life.