

posted by n1 on Saturday December 29 2018, @04:20PM   Printer-friendly
from the make-them-feel-like-your-real-parents dept.

Submitted via IRC for SoyCow1984

Alexa's advice to 'kill your foster parents' fuels concern over Amazon Echo

An Amazon customer got a grim message last year from Alexa, the virtual assistant in the company's smart speaker device: "Kill your foster parents."

The user who heard the message from his Echo device wrote a harsh review on Amazon's website, Reuters reported, calling Alexa's utterance "a whole new level of creepy".

An investigation found the bot had quoted from Reddit, a social media site known for harsh and sometimes abusive messages, people familiar with the investigation told Reuters.

The odd command is one of many hiccups that have happened as Amazon tries to train its machine to act something like a human, engaging in casual conversations in response to its owner's questions or comments.

The research is helping Alexa mimic human banter and talk about almost anything she finds on the internet. But making sure she keeps it clean and inoffensive has been a challenge.


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 2) by cubancigar11 on Sunday December 30 2018, @04:41PM

    by cubancigar11 (330) on Sunday December 30 2018, @04:41PM (#779989) Homepage Journal

Thanks. Grammarly should have caught it. Maybe I should buy a subscription after all. Right now I am not dealing with the English language most of the time, and its pricing makes me think they aren't targeting non-Americans. I will definitely buy it if my work ever depends on this skill.
