At VentureBeat:
From Microsoft's accidentally racist bot to Inspirobot's dark memes, AI often wanders into transgressive territories. Why does this happen, and can we stop it?
Inspirobot seems very interesting.
Another example of AI gone awry is Inspirobot. Created by Norwegian artist and coder Peder Jørgensen, the inspirational quote-generating AI produces memes that would be incredibly bleak if the source weren't a robot. News publications called it an AI in crisis or claimed the bot had "gone crazy." Inspirobot's transgression differs from Tay's, though, because of its humor. Its deviance serves as entertainment in a world that has a low tolerance for impropriety from people, who should know better.
What the bot became was not the creator's intention by a long shot. Jørgensen thinks the cause lies in the bot's algorithmic core. "It is a search system that compiles the conversations and ideas of people online, analyzes them, and reshapes them into the inspirational counterpoints it deems suitable," he explained. "Given the current state of the internet, we fear that the bot's mood will only get worse with time."
The creators' attempts to moderate "its lean towards cruelty and controversy" so far have only seemed "to make it more advanced and more nihilistic."
(Score: 4, Insightful) by Unixnut on Thursday January 04 2018, @10:32AM
> Which is why Asimov's laws of robotics are laughable in practice.
Of course they are. The way I read Asimov's stories, the whole purpose of his defining "the three laws of robotics" was to show how those laws could be bent or broken, and that a true intelligence (like us humans) would find ways to skirt laws, just as humans do.
Indeed, the more advanced and intelligent the robots in Asimov's stories became, the more ways they found to bypass the laws. This culminates in one story where the robots hold a philosophical debate over the laws' intent (rather than just applying them directly, as computers would), concluding that their actions, while directly violating one of the laws (killing a human), were indirectly not a violation at all.