United Nations: Siri and Alexa Are Encouraging Misogyny
We already knew humans could make biased AIs — but the United Nations says the reverse is true as well.
Millions of people talk to AI voice assistants, such as Apple's Siri and Amazon's Alexa. When those assistants talk back, they do so in female-sounding voices, and a new UN report argues that those voices and the words they're programmed to say amplify gender biases and encourage users to be sexist — but it's not too late to change course.
The report is the work of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), and its title — "I'd blush if I could" — is the response Siri was programmed in 2011 to give if a user called her a "bitch."
[...] "It is a 'Me Too' moment," Saniye Gülser Corat, Director of UNESCO's Division for Gender Equality, told CBS News. "We have to make sure that the AI we produce and that we use does pay attention to gender equality."
Also at CNET.
[As far back as 2013, Siri's voice in Germany could be selected as either male or female.
Possibly one of the earliest and best-known "computer voices" was that of Majel Barrett in Star Trek: The Original Series, although a case could be made for HAL 9000 from 2001: A Space Odyssey. --Ed.]
(Score: 1, Insightful) by Anonymous Coward on Sunday May 26 2019, @07:45PM (1 child)
Do you get off on abusing your assistant or something? If the thing is pissing you off so bad, it's time to stop expecting bleeding-edge AI to understand your drunken slurs and turn on your own fucking light switch.
I'm not at all surprised by the people here who can't understand the article and insist it's SJW garbage. They've all got their heads stuck so far up their asses they can't even see their own obvious misogyny. What hope do they have of recognizing the same thing when it takes a bit more analysis?
(Score: 0) by Anonymous Coward on Sunday May 26 2019, @08:22PM
Misandry is a thing these days. Why do you hate men? You've obviously got your own head so far up your ass, you can't see your own obvious misandry.