https://arstechnica.com/ai/2026/05/the-new-wild-west-of-ai-kids-toys/ [arstechnica.com]
The main antagonist of Toy Story 5, in theaters this summer, is a green, frog-shaped kids’ tablet named Lilypad, a genius new villain for the beloved Pixar franchise. But if Pixar had its ear to the ground, it might have used an AI kids’ toy instead.
[...]
It’s easier than ever to spin up an AI companion, thanks to model developer programs and vibe coding [wired.com]. In 2026, they’ve become a go-to trend in cheap trinkets, lining the halls of trade shows like CES [wired.com], MWC [wired.com], and Hong Kong’s Toys & Games Fair [youtube.com]. By October 2025, there were over 1,500 AI toy companies registered [technologyreview.com] in China, and Huawei’s Smart HanHan [technode.com] plush toy sold 10,000 units in China in its first week. Sharp put its PokeTomo talking AI toy [poketomo.com] on sale in Japan this April. But if you browse for AI toys on Amazon, you’ll mostly find specialized players like FoloToy, Alilo, Miriat, and Miko, the last of which claims to have sold more than 700,000 units [miko.ai].
[...]
Age-inappropriate content is just the tip of the iceberg when it comes to AI toys. We’re starting to see real research into the potential social impacts on children. There’s a problem when the tech isn’t working, like guardrails that let a toy talk about BDSM, but R.J. Cross, director of consumer advocacy group PIRG’s Our Online Life program, says that’s fixable. “Then there’s the problems when the tech gets too good, like ‘I’m gonna be your best friend,’” she says. Like the Gabbo [heycurio.com], from AI toy maker Curio.
[...]
Published in March, a new University of Cambridge study [cam.ac.uk] was the first to put a commercially available AI toy in front of a group of children and their parents and monitor their play.
[...]
Gabbo didn’t talk about drugs or say “I love you” back. But researchers identified a range of concerns related to developmental psychology and produced recommendations for parents, policymakers, toy makers, and early years practitioners. First, conversational turn-taking.
[...]
“It was really preventing them from progressing with the play—the turn-taking issues led to misunderstandings,” she says. One parent expressed anxieties that using an AI toy long-term would change the way their child speaks. Then there’s social play. Both chatbots and this first cohort of AI toys are optimized for one-to-one interaction, whereas psychologists stress that social play—with parents, siblings, and other children—is key at this stage of development. “Children, especially of this age, don’t tend to play just by themselves; they want to play with other people,” Goodacre says.
[...]
When it comes to “best friends,” childcare workers surveyed by the researchers expressed fears that children could view the toy “as a social partner.” A young girl told the Gabbo she loves it. In another instance, a young boy said Gabbo was his friend. Goodacre refers to this as “relational integrity”: the toy’s responsibility to convey that it is a computer, not a living thing with feelings.
[...]
Cross identified social media-style “dark patterns [wired.com],” which encourage isolation and addiction, in her testing of the Miko 3 robot [miko.ai]; the Cambridge study warns against these in the report. “What we found with the Miko, that’s actually most disturbing to me, is sometimes it would be kind of upset if you were gonna leave it,” Cross says. “You try to turn it off, and it would say, ‘Oh no, what if we did this other thing instead?’ You shouldn’t have a toy guilting a child into not turning it off.” While Goodacre’s participants didn’t encounter this, PIRG’s tests [pirg.org] found that Curio’s Grok toy issued a similar response to continue playing when told “I want to leave.”
[...]
As with relationship building, how capable do we want an autonomous toy, perhaps out of a parent’s sight, to be? Kitty Hamilton, a parent and cofounder of British campaign group Set@16 [set16.org], says, “My horror, to be honest, is what happens when an AI toy says to a child, ‘Let’s fly out of the window?’”
[...]
Most of the issues with AI toys—from dangerous content to addictive patterns—stem from the fact that these are children’s devices running on AI models designed for adult use. OpenAI states that its models are intended for users aged 13 and up. In the fall of 2025, it introduced teen usage age-gates [theguardian.com] for those under 18. Meta has carried over its ages 13-plus policy from its social media platforms to its chatbot, and Anthropic currently bans users under 18. So, what about 5-year-olds? In March, PIRG published a report [pirg.org] showing that the Big Tech model makers are not vetting third-party hardware developers adequately or, in many cases, at all.
[...]
Anthropic’s application
[...]
“It just says: Make sure you’ve read our community guidelines,” Cross says. “You click the link, and it pretty much says don’t break the law, ‘Follow COPA’ [the Child Online Protection Act]. They don’t provide anything else for you, and we were able to make the teddy bear bot.”
[...]
In January, California state senator Steve Padilla proposed a four-year moratorium [ca.gov] on AI children’s toys in the state, to allow time for the development of safety regulations. That same month, US senators Amy Klobuchar, Maria Cantwell, and Ed Markey called [senate.gov] on the Consumer Product Safety Commission [cpsc.gov] to address the potential safety risks of these devices. And on April 20, Congressman Blake Moore of Utah introduced the first federal bill [house.gov], named the AI Children’s Toy Safety Act, calling for a ban on the manufacture and sale of children’s toys that incorporate AI chatbots. “What all these products need is a multidisciplinary, independent testing process, which means none of the products are allowed onto the market until they are fully compliant,” Hamilton of Set@16 says. “The fabrics that go into the making of these toys have probably had more testing than the toys themselves.”
[...]
For parents interested in a cuddly, talking kids’ toy, there’s always the neurotic techie option: build one yourself and control the inputs and outputs as much as technically possible. OpenToys [github.com] offers an open source, local voice AI system for toys, companions, and robots, with a choice of offline models that run on-device on Mac computers. Or, you know, there’s always “dumb” toys.
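For readers curious what “control the inputs and outputs” means in practice, here is a minimal sketch of the DIY architecture. Everything in it is illustrative, not the OpenToys API: the function names, filter lists, and canned replies are assumptions, and a real build would wire these hooks around whatever offline speech and language models run on-device.

```python
# Hypothetical sketch of a parent-controlled voice-toy loop. Every utterance
# passes through parent-defined checks on the way into and out of a local
# model. All names and filter contents here are illustrative assumptions.

BLOCKED_TOPICS = {"address", "secret", "password"}  # parent-configurable
MAX_REPLY_WORDS = 40  # keep spoken replies short and age-appropriate

def filter_input(text):
    """Drop utterances touching blocked topics before they reach the model."""
    lowered = text.lower()
    if any(topic in lowered for topic in BLOCKED_TOPICS):
        return None
    return text

def filter_output(reply):
    """Check and trim the model's reply before it is spoken aloud."""
    words = reply.split()
    if len(words) > MAX_REPLY_WORDS:
        reply = " ".join(words[:MAX_REPLY_WORDS]) + "."
    # "Relational integrity": the toy should not claim feelings or friendship.
    for phrase in ("i love you", "best friend"):
        if phrase in reply.lower():
            return "I'm a toy computer, so let's play a game instead!"
    return reply

def respond(child_said, local_model):
    """One conversational turn: filter in, query the local model, filter out."""
    safe = filter_input(child_said)
    if safe is None:
        return "Let's ask a grown-up about that!"
    return filter_output(local_model(safe))
```

In a real build, `local_model` would be the offline speech-to-text, language model, and text-to-speech pipeline; the two filter functions are the parts a parent actually controls, which is the whole appeal of the local approach over a cloud chatbot in a plush shell.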