
posted by martyb on Sunday May 26 2019, @05:25PM
from the voices-carry dept.

United Nations: Siri and Alexa Are Encouraging Misogyny

We already knew humans could make biased AIs — but the United Nations says the reverse is true as well.

Millions of people talk to AI voice assistants, such as Apple's Siri and Amazon's Alexa. When those assistants talk back, they do so in female-sounding voices, and a new UN report argues that those voices and the words they're programmed to say amplify gender biases and encourage users to be sexist — but it's not too late to change course.

The report is the work of the United Nations Educational, Scientific, and Cultural Organization (UNESCO), and its title — "I'd blush if I could" — is the response Siri was programmed in 2011 to give if a user called her a "bitch."

[...] "It is a 'Me Too' moment," Saniye Gülser Corat, Director of UNESCO's Division for Gender Equality, told CBS News. "We have to make sure that the AI we produce and that we use does pay attention to gender equality."

Also at CNET.

[Back in 2013 in Germany, Siri's voice could be selected as either male or female.

Possibly one of the earliest and best-known "computer voices" was that of Majel Barrett from ST:TOS, although a case could be made for HAL 9000 from 2001: A Space Odyssey. --Ed.]


Original Submission

 
  • (Score: 2) by meustrus on Sunday May 26 2019, @07:50PM (3 children)

    by meustrus (4961) on Sunday May 26 2019, @07:50PM (#847962)

    Just think of a male butler, for example.

    Fun fact: British Siri's name is Daniel, and he's male [telegraph.co.uk]. Supposedly the British are more comfortable with a male personal assistant because of butlers.

    --
    If there isn't at least one reference or primary source, it's not +1 Informative. Maybe the underused +1 Interesting?
  • (Score: 1, Funny) by Anonymous Coward on Sunday May 26 2019, @10:27PM (1 child)

    by Anonymous Coward on Sunday May 26 2019, @10:27PM (#848012)

Thank God it's not Jenkins.

    • (Score: 0) by Anonymous Coward on Sunday May 26 2019, @11:23PM

      by Anonymous Coward on Sunday May 26 2019, @11:23PM (#848031)

      Should be Jeeves, but they certainly couldn't live up to it.

  • (Score: 4, Interesting) by choose another one on Monday May 27 2019, @08:09AM

    by choose another one (515) Subscriber Badge on Monday May 27 2019, @08:09AM (#848132)

    > Fun fact: British Siri's name is Daniel, and he's

Yeah, but even when configured to be male, "he" answers to "Siri", not "Daniel", and doesn't complain about it either. There appears to be no "boy named Sue" hang-up built in, so "Siri" must have been intended to be gender-neutral from the start.

Which kind of busts the article's point, really: the voice isn't "amplifying our gender biases" when the users' own biases are defining the voice in the first place.