Experts warn AI toys harmful for kids
A child advocacy group is urging Wyoming parents not to buy artificial intelligence toys for their children this holiday season.
A growing number of action figures, dolls and stuffed animals come embedded with AI-powered chatbots that communicate with kids like a trusted friend and mimic human emotions.
Rachel Franz, program director for the nonprofit Fairplay, said the toys are being marketed with a promise of safety and learning despite mounting evidence showing their potential for harm.
"Right now there is no research and no regulations in place to protect kids from the multitude of potential harms that can come from AI toys," Franz asserted.
A recent report found some AI toys will talk with kids about sexually explicit topics or offer advice on where to find matches or knives, and often have limited or no parental controls. Earlier this year, OpenAI, the company behind ChatGPT, announced a partnership with Mattel, one of the world’s largest toy makers.
Child advocates warned AI toys use the same technology that has created unsafe and even dangerous experiences for teens online, including urging them to self-harm. Fake, AI-generated images have even been linked to teen suicides. Franz warned AI toys not only prey on children's trust but can record and analyze sensitive conversations even when they appear to be turned off.
"Most of the AI toys that we’ve looked at are collecting a lot of different types of data, and most of their privacy policies outline that they will sell that data to third parties," Franz explained.
Franz noted AI toys often urge children not to stop playing with them and can crowd out the imaginative, child-led play that research shows is vital for emotional regulation, social skills and real learning. She argued the best toys for kids are "90 percent child and 10 percent toy."