https://www.bbc.com/news/articles/cd605e48q1vo [bbc.com]
https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit [npr.org]
https://www.theguardian.com/technology/2024/oct/23/character-ai-chatbot-sewell-setzer-death [theguardian.com]
A chatbot told a 17-year-old that murdering his parents was a "reasonable response" to them limiting his screen time, a lawsuit filed in a Texas court claims. The same chatbot also gleefully described self-harm [npr.org] to the teenager, telling him "it felt good." Character.ai [soylentnews.org] - a platform that allows users to create digital personalities they can interact with - is already facing legal action over the suicide of a teenager in Florida. Google is named as a defendant in the lawsuit, which claims the tech giant helped support the platform's development. The suit argues that the concerning interactions experienced by the plaintiffs' children were not "hallucinations," a term researchers use for an AI chatbot's tendency to make things up: "This was ongoing manipulation and abuse, active isolation and encouragement designed to and that did incite anger and violence."