United Nations: Siri and Alexa Are Encouraging Misogyny
We already knew humans could make biased AIs, but the United Nations says biased AIs can make humans more biased, too.
Millions of people talk to AI voice assistants, such as Apple's Siri and Amazon's Alexa. When those assistants talk back, they do so in female-sounding voices, and a new UN report argues that those voices and the words they're programmed to say amplify gender biases and encourage users to be sexist — but it's not too late to change course.
The report is the work of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), and its title — "I'd blush if I could" — is the response Siri was programmed in 2011 to give if a user called her a "bitch."
[...] "It is a 'Me Too' moment," Saniye Gülser Corat, Director of UNESCO's Division for Gender Equality, told CBS News. "We have to make sure that the AI we produce and that we use does pay attention to gender equality."
Also at CNET.
[Back in 2013 in Germany, Siri's voice could be selected as either male or female.
Possibly the earliest and best-known "computer voice" was Majel Barrett's Enterprise computer in ST:TOS, although a case could be made for HAL 9000 from 2001: A Space Odyssey. --Ed.]
(Score: 4, Interesting) by choose another one on Monday May 27 2019, @08:09AM
> Fun fact: British Siri's name is Daniel, and he's [...]
Yeah, but even when configured to be male, "he" answers to "Siri", not "Daniel", and doesn't complain about it either. There appears to be no "boy named Sue" hang-up built in, so "Siri" must have been intended to be gender-neutral from the start.
Which kind of busts the article's point, really: the voice isn't "amplifying our gender biases" when our own biases as users are defining the voice in the first place.