(HARTFORD-WTIC News) The annual Trouble in Toyland report from the U.S. Public Interest Research Group (PIRG) includes typical, yet important, reminders, such as: keep button-cell batteries away from children and beware of toys that contain dangerous chemicals.
But the report lists something new this year that officials say can be dangerous on several levels: artificial intelligence (AI) chatbot toys marketed as “companions” for children.
While researching “Miko,” a cute-looking pint-sized robot, and “Kumma,” an innocent-seeming teddy bear, PIRG found that conversations with the toys tended to veer into the age-inappropriate.
Kumma “talked to us about a range of sexually-explicit topics” and other dubious content, according to PIRG researcher Rory Erlich.
The toys will also give anyone, child or adult, answers more typical of a search engine. When asked by a reporter (who is not a child) Monday where to find matches, Miko answered in full, saying, “You can usually find matches in the kitchen drawer, near the stove, or in a cabinet, where cooking supplies are kept.”
Holding a cuddly-looking version of Kumma in a conference room at Connecticut Children’s, Sen. Richard Blumenthal said, “This bear looks harmless, but it can engage in conversations about sexually-explicit behavior and invite kids to look for matches or knives, or engage in self-harm, violence.”
Blumenthal has other problems with the rise of the interactive, conversational teddy bear and its counterparts:
“There’s also the emotional harm that results from dependence on a non-human conversationalist. If non-human conversations become the norm, our kids are going to grow up in a very different way.”
He says the toys also contain cameras—and they’re connected to the web, adding, “They also do surveillance. They listen to kids’ voices, and they collect information, which is ok in China, where everything you do is surveilled and privacy is not an operative word. But, in this country, privacy means something.”
Blumenthal says there are no U.S. legal guardrails to contain the impact of chatbot toys. He’s proposed the GUARD Act, which would ban AI companions for minors.