Google Photos tries to categorize your pictures automatically. Until very recently, it had a failure mode in which it classified some pictures of humans as "Gorillas".
Google reacted [and apologised] very quickly after receiving a complaint from a black man who had been misclassified.
When Brooklyn-based computer programmer Jacky Alcine looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you, “Search by what you remember about a photo, no description needed.” In Alcine’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on.
Then there was a picture of Alcine and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On investigation, Alcine found that many more photographs of the pair—and nothing else—had been placed under this literally dehumanizing rubric.
Speculating: their software is probably heavy on statistical matching and genuinely hard to debug, which may be why they wound up simply removing "Gorilla" from the list of possible labels rather than fixing the underlying model.
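If that speculation is right, the mitigation amounts to a post-filter on the classifier's output rather than a change to the model itself. A minimal sketch of that idea follows; the blocklist contents, label names, and confidence scores are all invented for illustration, and this is in no way Google's actual implementation:

```python
# Hypothetical post-filter: suppress known-problematic labels from the
# classifier's output instead of retraining or debugging the model.
# The blocklist and all scores below are made up for illustration.

BLOCKED_LABELS = {"gorilla", "chimpanzee", "monkey"}  # assumed blocklist

def filter_labels(predictions, threshold=0.5):
    """Keep labels above a confidence threshold, excluding blocked ones.

    predictions: dict mapping label -> confidence score in [0, 1].
    """
    return {
        label: score
        for label, score in predictions.items()
        if score >= threshold and label.lower() not in BLOCKED_LABELS
    }

# Example with made-up classifier scores; the blocked label is dropped
# no matter how confident the model is in it.
raw = {"Person": 0.91, "Gorilla": 0.74, "Outdoors": 0.60, "Car": 0.12}
print(filter_labels(raw))
```

The appeal of such a filter is that it ships in minutes and is trivially auditable; the cost is that it papers over the model's error rather than correcting it, which fits the "hard to debug" theory above.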
(Score: 0) by Anonymous Coward on Wednesday July 01 2015, @02:24PM
This is the result of Social Justice. Everything is horribly offensive, and everything must have a hate campaign against it.
(Score: 0) by Anonymous Coward on Wednesday July 01 2015, @04:20PM
Yup. The "Dukes of Hazzard General Lee" Hot Wheels car is now deemed offensive and has been pulled, and is selling for up to $300 on eBay.
(Score: 4, Informative) by eof on Wednesday July 01 2015, @05:02PM
Did you read the article? The guy pointed it out and went on with his life.