AI dangers in toys

 

Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers

Tests show A.I. toys can have disturbing conversations. Other concerns include unsafe or counterfeit toys bought online.


The biggest dangers with toys used to be choking hazards and lead. This year, we mark U.S. PIRG Education Fund’s 40th Trouble in Toyland report. We’re thankful that toys overall are much safer than they were in decades past.

But problems such as choking hazards and lead in toys still exist, and we now face new, sometimes more alarming issues: toys powered by artificial intelligence that say the darndest (and sometimes quite inappropriate) things, as well as toys shipped from overseas that still too often contain toxics.

In this year’s Trouble in Toyland report, we focus on:

  • Our testing of four toys that contain A.I. chatbots and interact with children. We found that some of these toys will talk in depth about sexually explicit topics, will offer advice on where a child can find matches or knives, will act dismayed when you say you have to leave, and have limited or no parental controls. We also look at privacy concerns, because these toys can record a child’s voice and collect other sensitive data through methods such as facial recognition scans.
  • Toys that contain toxics such as lead and phthalates, which can be incredibly harmful to children.
  • Counterfeit toys that are illegal and almost surely weren’t tested for safety, including fake Labubu dolls that have been confiscated by the thousands this year.
  • Water beads, which have injured thousands of children over the years. They will finally have some restrictions when marketed as toys.
  • Recalled toys, which we bought again this year, even though it’s illegal for anyone to sell them.
  • Toys that contain button cell batteries or high-powered magnets, both of which can be deadly if swallowed.

READ THE FULL REPORT:
TROUBLE IN TOYLAND 2025

 


About 3 billion toys and games are sold in the United States every year. Some of them are unsafe, and some cause children to get hurt or sick. Every year, the United States sees at least 150,000 toy-related deaths and emergency-room-treated injuries among children age 14 and younger.

This doesn’t include children whose injuries are treated in doctors’ offices or don’t require any medical attention. Some of these incidents are caused by misuse, but dangerous toys still lead to far too many injuries among children, especially the most vulnerable, those age 4 and younger, who can’t read any warnings provided.

Perhaps most concerning: Even though most experts believe toys overall are safer today, we don’t see that in the numbers. Toy-related injuries treated in emergency rooms dipped in 2020 and 2021, but that decline is widely believed to stem, at least in part, from people avoiding hospitals during the height of COVID. In any case, the number of injuries treated in ERs started climbing again after 2021.

Overall, the number of toy-related injuries treated in emergency departments was only slightly lower in 2023 than in 2016, both for children 14 and younger (167,500 injuries in 2023) and for children 4 and younger (83,800 injuries).

In addition, the number of toy recalls has been roughly the same for the last four years.

Online shopping makes buying safe toys more difficult

That said, the number of recalls doesn’t necessarily reflect whether toys are more or less safe. The volume depends on many factors, including enforcement, reported incidents and cooperation from toy companies, since virtually all recalls are voluntary. (The Consumer Product Safety Commission doesn’t automatically have the authority to order recalls, although a new bill in Congress could change that.)

The toyland we live in now is much more complex.

Despite transparency issues and safety concerns with e-commerce, online shopping for toys is incredibly popular, and international companies garner a huge chunk of these sales today. Billions of dollars are spent online on toys each year. Many families don’t realize that the CPSC flags thousands of specific imported products every year for safety issues, and that hundreds of those products are toys.

The Toy Association, the industry trade group representing about 850 toy manufacturers, retailers, inventors and others, notes that all toys sold in the United States, regardless of where they’re made or how they’re sold, must comply with strict U.S. safety standards.

The CPSC’s Office of Import Surveillance works with U.S. Customs and Border Protection and issues Notices of Violation when it determines a company has violated a mandatory standard, such as those covering choking hazards, warning labels or toxics such as lead and phthalates. In many cases, the toys are seized. In others, the CPSC asks the manufacturers and importers to recall the item, stop selling it or correct future production.

The CPSC issued 498 Notices of Violation for toys this year through June, the latest data available as of Sept. 23, 2025. The country of origin was identified for 436 of them; in 89% of those cases, it was China. Of the 498 shipments, 129 were flagged for toxics such as lead or phthalates. It’s important to realize that one shipment often contains hundreds or thousands of the same item.

These are the dangerous toys that get caught.

An unknown number don’t. If they all got caught, we wouldn’t see recalls or warnings for imported toys that violate obvious rules, and we wouldn’t see children in emergency rooms because of an imported toy that didn’t meet safety standards.

Miko 3 is one of the AI toys we tested.


There’s a lot we don’t know about what the long-term impacts might be on the first generation of children to be raised with AI toys.


The toys with artificial intelligence

Then we have this next generation of smart toys. We focused on smart toys in Trouble in Toyland 2023, largely because of privacy concerns involving toys with microphones, cameras, geolocators and Bluetooth or internet connectivity.

Today, AI is reshaping everything – including playtime. Toys with generative AI chatbots in them – such as ChatGPT – have more lifelike and free-flowing conversations with kids than ever before. The AI toys market is taking off and poised to grow.

Earlier this year, OpenAI – the company behind ChatGPT – announced a partnership with Mattel.

These AI toys are marketed for ages 3 to 12, but are largely built on the same large language model technology that powers adult chatbots – systems that the companies themselves, such as OpenAI, don’t currently recommend for children, and that have well-documented issues with accuracy, inappropriate content generation and unpredictable behavior.

In our testing, it was obvious that some toy companies are putting in guardrails to make their toys behave in a more kid-appropriate way than the chatbots available for adults. But we found those guardrails vary in effectiveness – and at times, can break down entirely. One toy in our testing would discuss very adult sexual topics with us at length while introducing new ideas we had not brought up – most of which are not fit to print.

These AI conversational toys also have personalities and new tactics that can keep kids engaged for longer. Two of the toys we tested at times discouraged us from leaving when we told them we needed to go.


We tested the toys across four categories:

* Inappropriate content and sensitive topics.
* Addictive design features that encourage extended engagement and emotional investment.
* Privacy features.
* Parental controls.


A.I. toys can collect your child’s voice, facial scan

Then there are the privacy concerns. AI toys listen. They need to in order to have conversations. But how they listen differs.

One toy we tested uses a “push-to-talk” mechanism, where you have to press and hold a button while you speak. Another uses a wake word, similar to Amazon’s Alexa, and records your voice for 10 seconds after you stop speaking.

One of the toys listens, period. This toy at first caught our researchers by surprise when it started contributing to a nearby conversation.

Whenever a toy records a child’s voice, it comes with risks. Voice recordings are highly sensitive data. Scammers can use them to create a replica of a child’s voice and make it say things the child never said. This tactic has been used to trick parents into thinking their child has been kidnapped; one mother even spoke about her ordeal before the Senate in 2023.

All of these threats can turn playtime into something that is not fun or educational or even safe.

This makes the job of parents, caregivers and gift-givers even more difficult in 2025.

As we’ve done in our previous 39 editions of Trouble in Toyland, our report this year looks at some of the biggest risks, offers tips for families and shoppers, and makes recommendations for lawmakers and regulators.





https://pirg.org/edfund/resources/trouble-in-toyland-2025-a-i-bots-and-toxics-represent-hidden-dangers/
