
posted by janrinok on Wednesday July 01 2015, @05:29AM   Printer-friendly
from the did-you-really-just-call-her-that? dept.

Google Photos tries to categorize your pictures automatically. Until very recently, it had a failure mode in which its classification for some pictures of humans was "Gorillas".

Google reacted (and apologised) very quickly when it got a complaint from Jacky Alcine, a black programmer whose photos had been misclassified.

When Brooklyn-based computer programmer Jacky Alcine looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you, “Search by what you remember about a photo, no description needed.” In Alcine’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on.

Then there was a picture of Alcine and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On investigation, Alcine found that many more photographs of the pair—and nothing else—had been placed under this literally dehumanizing rubric.

Speculating: it's possible that their software leans heavily on statistical matching and is genuinely hard to debug, which may be why they wound up simply deleting "Gorilla" from the list of possible categories.
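If that's the case, the quickest patch would be post-hoc label suppression rather than a retrained model. A minimal sketch of that idea in Python (the names and structure are purely illustrative, not Google's actual code):

    # Hypothetical hotfix: hide a known-bad label at the output stage
    # instead of retraining the model underneath. Illustrative only.
    BLOCKED_LABELS = {"gorilla"}

    def visible_labels(predictions):
        """Filter (label, confidence) pairs through the denylist."""
        return [(label, conf) for label, conf in predictions
                if label.lower() not in BLOCKED_LABELS]

    print(visible_labels([("skyscraper", 0.91), ("gorilla", 0.83)]))
    # -> [('skyscraper', 0.91)]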

http://www.slate.com/blogs/future_tense/2015/06/30/google_s_image_recognition_software_returns_some_surprisingly_racist_results.html


Original Submission

 
  • (Score: 4, Informative) by Gravis on Wednesday July 01 2015, @06:44AM

    by Gravis (4596) on Wednesday July 01 2015, @06:44AM (#203646)

    image in question: https://pbs.twimg.com/media/CIoW7wBWoAEqQRP.png [twimg.com]
    actual gorilla: https://c1.staticflickr.com/7/6169/6151464249_1824cde43d_b.jpg [staticflickr.com]

    A very dark face with a wide nose, seemingly surrounded by black. Is it really a surprise it was labeled as gorillas?

  • (Score: 0) by Anonymous Coward on Wednesday July 01 2015, @07:29AM

    by Anonymous Coward on Wednesday July 01 2015, @07:29AM (#203650)

    The human brain treats faces separately from general recognition tasks. Maybe Google should implement something similar in its algorithms. Have some "looks like a face" detector and divert anything that qualifies to an extra algorithm that is specialized in recognizing human faces.
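    A minimal sketch of that routing in Python, with hypothetical callables standing in for the real detector and classifiers:

        # Two-stage idea: a "looks like a face" detector routes face-like
        # regions to a specialized classifier. All callables here are
        # hypothetical stand-ins, not any real API.
        def classify(image, detect_faces, classify_face, classify_general):
            face_boxes = detect_faces(image)  # regions that look like faces
            if face_boxes:
                # Divert anything that qualifies to the specialized model.
                # Assumes a NumPy-style array and (left, top, right, bottom) boxes.
                return [classify_face(image[t:b, l:r])
                        for (l, t, r, b) in face_boxes]
            return classify_general(image)  # everything else goes here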

    • (Score: 0) by Anonymous Coward on Wednesday July 01 2015, @07:34AM

      by Anonymous Coward on Wednesday July 01 2015, @07:34AM (#203652)

      Have some "looks like a face" detector and divert anything that qualifies to an extra algorithm that is specialized in recognizing human faeces.

      Same shit will happen.

    • (Score: 4, Informative) by linuxrocks123 on Wednesday July 01 2015, @07:44AM

      by linuxrocks123 (2557) on Wednesday July 01 2015, @07:44AM (#203654) Journal

      A face detector wouldn't be enough, because gorillas have faces. Looking at the photos, it's a selfie with a black girl's face taking up most of the picture; the guy is way in the background.

      The girl has long black hair on both sides of her face, and her body (which would have been helpful for disambiguation) isn't in the photo. I don't have much experience with image recognition software, but her hair, combined with the photo being a closeup of just her face, is probably what did it.

      Probably the ultimate fix will be changing the heuristics to err way, way on the side of labeling gorillas as people, rather than the other way around. If a real gorilla photo ends up in the "people" category, the gorilla won't be offended.
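      In code, that bias could be as blunt as demanding an overwhelming margin before the risky label is ever emitted. A sketch, where the margin is an arbitrary made-up value:

          # Err way on the side of "person": only output "gorilla" when its
          # score beats "person" by a large margin. 0.4 is arbitrary.
          REQUIRED_MARGIN = 0.4

          def person_or_gorilla(scores):
              """scores: label -> confidence, e.g. {"person": 0.3, "gorilla": 0.8}."""
              margin = scores.get("gorilla", 0.0) - scores.get("person", 0.0)
              if margin >= REQUIRED_MARGIN:
                  return "gorilla"
              return "person"  # a misfiled gorilla photo offends nobody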

      • (Score: 2) by linuxrocks123 on Wednesday July 01 2015, @07:47AM

        by linuxrocks123 (2557) on Wednesday July 01 2015, @07:47AM (#203655) Journal

        Oh, also, I would guess it's highly likely Google already has a face detector. That's one of the easiest ways to find people in a photo.

      • (Score: 0) by Anonymous Coward on Wednesday July 01 2015, @09:09AM

        by Anonymous Coward on Wednesday July 01 2015, @09:09AM (#203683)

        A face detector wouldn't be enough

        If you had read my comment instead of just skimming through it, you would have noticed that I didn't claim it would be. All the face detector would do is identify the areas needing special attention. The real work would have to be in the specialized routines that are run on those areas.

        • (Score: 2) by linuxrocks123 on Wednesday July 01 2015, @09:21AM

          by linuxrocks123 (2557) on Wednesday July 01 2015, @09:21AM (#203688) Journal

          Apologies. That's a good idea: detect faces, and then spend extra effort trying to classify the species of the face if it's not obvious from other parts of the photo.

          Many photos wouldn't need the extra effort, even if there's a face, because you can tell from the clothing and/or the shape of the body whether or not it's a human (see the sketch below).

          It would be nice if Google would post some information about their algorithms in general, how they went wrong in this case, and what they're doing to fix it.
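          A sketch of that gate in Python: run the expensive species classifier only when contextual cues are inconclusive. Every name and the threshold here are hypothetical:

              # Trust context (clothing, body shape) first; fall back to a
              # fine-grained species classifier only when context is weak.
              def label_face(face_crop, context_confidence, classify_species,
                             human_threshold=0.8):
                  if context_confidence >= human_threshold:
                      return "human"  # context already settles it
                  return classify_species(face_crop)  # extra-effort path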

      • (Score: 1) by esperto123 on Wednesday July 01 2015, @12:19PM

        by esperto123 (4303) on Wednesday July 01 2015, @12:19PM (#203731)

        I think having gorillas be classified as people would also make people lose their shit (no pun intended), although probably not as much.

        But it got me thinking: we can differentiate between gorillas and humans quite easily, even though we are VERY similar. What do we use to differentiate? Face proportions? Wrinkles? Bone structure?

        • (Score: 2) by Freeman on Wednesday July 01 2015, @07:12PM

          by Freeman (732) on Wednesday July 01 2015, @07:12PM (#203908) Journal

          Intelligence.

          --
          Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
        • (Score: 2) by tangomargarine on Wednesday July 01 2015, @08:23PM

          by tangomargarine (667) on Wednesday July 01 2015, @08:23PM (#203952)

          I think having gorillas be classified as people would also make people lose their shit (no pun intended),

          What pun?

          --
          "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
  • (Score: 0) by Anonymous Coward on Wednesday July 01 2015, @04:07PM

    by Anonymous Coward on Wednesday July 01 2015, @04:07PM (#203809)

    Don't start throwing the racist card at this... but I've noticed some people's faces do look like gorillas, AND... I've heard young children say "mommy, look at the monkey" when seeing a black person. It was totally innocent, and no racism was implied.

    • (Score: 2, Touché) by albert on Wednesday July 01 2015, @05:16PM

      by albert (276) on Wednesday July 01 2015, @05:16PM (#203839)

      We learn to suppress such thoughts as we grow older. We learn there are things you can't say or even think.

      Other good ones involve obese and pregnant people being confused, especially when male. :-)

      • (Score: 0) by Anonymous Coward on Wednesday July 01 2015, @07:45PM

        by Anonymous Coward on Wednesday July 01 2015, @07:45PM (#203937)

        Yes. My point was: if innocent children can make that mistake, then so can a computer algorithm. I have yet to see a racist program, unless it was intentional. Good point about the mistaken pregnancy label; I've done it myself.

        • (Score: 0) by Anonymous Coward on Thursday July 02 2015, @02:39AM

          by Anonymous Coward on Thursday July 02 2015, @02:39AM (#204055)

          > I have yet to see a racist program, unless it was intentional.

          That is circular reasoning - software can't be racist because, by your definition, racism requires intent, and therefore you've never seen an unintentionally racist program.

          And yet here we have an example of one.

      • (Score: 0) by Anonymous Coward on Thursday July 02 2015, @02:44AM

        by Anonymous Coward on Thursday July 02 2015, @02:44AM (#204057)

        > We learn to suppress such thoughts as we grow older. We learn there are things you can't say or even think.

        Maybe you learned that, but you learned the wrong lesson.

        What normal people learn is that being needlessly insulting is a shitty way to live your life.

  • (Score: 1) by jpkunst on Sunday July 05 2015, @11:56AM

    by jpkunst (2310) on Sunday July 05 2015, @11:56AM (#205266)

    Yes. Look at the lips. Black people have thick lips, gorillas (and other apes) have thin lips.