posted by takyon on Friday March 25 2016, @05:04AM
from the truth-is-online dept.

Microsoft's new AI Twitter bot, @tayandyou, was shut down after only 24 hours, once it began making "offensive" tweets.

The bot was built "by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians," and was designed to target 18- to 24-year-olds.

Shortly after the bot went live, it began making offensive tweets endorsing Nazism and genocide, among other things.

As of this submission, the bot has been shut down and all but three of its tweets have been deleted.

The important question is whether it succeeded in passing the Turing test.

takyon: This bot sure woke fast, and produced much more logical sentence structures than @DeepDrumpf.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by aristarchus (2645) on Friday March 25 2016, @06:47AM (#322810) Journal

    Microsoft has been made a bitch by far better programmers than they employ. And they never saw it coming.

    Saw that coming! They should've ducked! Or at least Duck-Duck-Go_ed. Programmers, working for Microsoft? Well, I have heard of more crazy things. Like pedophiles becoming Catholic Priests. Or racist Serbians trying to get command in Bosnian areas. Or foxes saying they want to guard the hen house. Or Microsoft saying "TrusTed Computing", where TRUS stands for Trans Rectal Ultrasound Scan, and "Ted" stands for either Ted or Tom Cruz, or Bill Gates. As Buckaroo Banzai said: "This far up the anus, it all looks the same." Oh, and: "Don't tug on that, you never know what it might be connected to."

  • (Score: 2, Disagree) by dyingtolive (952) on Friday March 25 2016, @07:05AM (#322815)

    Buckaroo Banzai never said that.

    Also, I really feel like you're hitting the satire too hard, dude. I mean, you're seldom wrong, but dial it back a bit before you hurt yourself. We're all just a little worried about you is all.

    --
    Don't blame me, I voted for moose wang!
    • (Score: 3, Funny) by aristarchus (2645) on Friday March 25 2016, @07:23AM (#322822) Journal

      Well, Buckaroo did say something analogous. That's all I'm saying. I mean, really, this far into a Microsoft AI, and you are worried about me hitting the satire too hard? If not now, then when? If not here, then where? And if not, not, not, . . . . My God, it's full of farts!!!

    • (Score: 2) by JeanCroix (573) on Friday March 25 2016, @02:55PM (#322914)

      It would appear that our aristarchusbot has been struck with the same affliction as Tay.