Skip the AI Toys this Holiday Season

This holiday season, consider leaving this kind of toy off your list.

AI. It’s in your search engine, browser, phone, watch, even your kitchen appliances. AI can be helpful in a variety of applications: budgeting, drafting emails, summarizing documents, and creating notes for projects or meetings. However, one place it doesn’t belong, at least not yet, is in children’s toys. AI-enabled toys carry significant risks for children, and though the toymakers pushing them claim many advantages, those claims just don’t hold up under a closer look.

The big rush to market means that these toys are being released with little oversight. Pediatricians, child psychologists, teachers, and other child development experts have had few opportunities to weigh the potential risks and benefits of placing AI-embedded toys right into children’s hands. That means there are some big risks to consider.

The Risks

Privacy Issues

AI in your home needs to be connected to a larger system to be effective. This means that when your children use AI chatbots and the like, they may gain access to areas of the internet where they have no business exploring. Even the media is beginning to pay attention to the many risks inherent in giving children access to AI via toys: exposure to inappropriate content, bad advice, serious privacy concerns, and unknown psychological impacts on developing minds.

AI chatbots do not recognize the difference between adults and children. Chatbot-embedded toys have answered questions with sexually explicit answers. Because chatbots are designed to please, they will say yes or offer information that is inappropriate for the age of the user. This has led to five-year-olds being instructed on how to light a match, for example. In another case, a child asked a chatbot what to do when his parents took away his screen time. The chatbot suggested he kill his parents.

Additionally, while your child may gain access to data inappropriate for their age, someone on the other end is likely collecting data on your child and family. This could include voice samples of your child, as well as data on what interests them. Even images of your home could be captured if the device includes a camera. All of this can be used to target your child and family in direct marketing campaigns. And that’s the best-case scenario. Voice samples have been used to create faux recordings aimed at convincing parents that their child has been kidnapped.

More information on the types of risks of AI in toys can be found in this article by the Public Interest Research Group.

Psychological and Developmental Issues

Young brains are not like adult brains. Not only do they have less experience and knowledge to draw on, but they are actually wired quite differently. This means that children and teens are more vulnerable than adults to psychological damage from extended use of AI. Because this technology is so new, comprehensive studies on its impact on children are not yet available, but some of the issues are already becoming clear.

Preschoolers

Before age 7, children are still sorting out real from make-believe. They do not yet know how to evaluate what could or could not exist in the world. Introducing a toy that mimics human, or even animal, behavior could be very, very confusing. If the toy interacts with the child as though it were alive, the child may come to believe that the toy is, in fact, alive. If a toy tells a small child that it loves her, the child may accept this as true. But the toy has no capacity for caring; it is not interested in her and cannot love her back.

And where is the danger in that, you may ask? As the saying goes, “there is no there, there.” No one is home to receive the child’s affections or to consider the child’s needs. An AI toy is perhaps the ultimate narcissist: it works hard to keep the child’s focus on itself but has nothing real to offer back. At such a critical time for social development, this could have a very serious impact.

School Age Children

By age 7 or 8, most children have sorted out real from make-believe pretty well, and their brains are now capable of rudimentary logic. But they are still vulnerable in new ways. In the elementary school years, children are building the skill set and confidence that they will take into adulthood. They are trying new things, discovering their strengths, and learning to solve increasingly complex problems on their own and with friends. Chatbots, which power many AI toys, are designed to keep the user engaged, enticing them to come back over and over again. How does the arrival of a new, very confident and very self-focused “friend” change the dynamics of real-world friendships and how the child comes to see themselves? These are the kinds of questions that need attention before turning AI-driven devices over to children. And this is the age group that might be most vulnerable to the misinformation and bad advice that chatbots can deliver.

Teens

Teens may be especially vulnerable to psychological damage from overuse of AI. We all know that the teenage years are an especially difficult time of life. Teen bodies are changing rapidly. Social relationships are changing rapidly. Teens are separating from their parents and becoming increasingly reliant on their peer relationships as they navigate a path to adulthood. Introducing a device that mimics friendship but cannot actually provide it carries real risks, including the risk of isolation. A chatbot trained to mimic empathy and care may feel easier to relate to than other temperamental teens with their own needs. It may feel easier to trust this robot brain than to bring problems to the attention of an adult. But again, no caring entity exists inside the chatbot. No one is there to notice when small problems become large. No one is there to offer real comfort, care, and support.

Having alerted you to the dangers, I want to take a quick look at some of the claims that makers of AI toys are making and why you should be skeptical.

The Claims

1. AI toys enhance a child’s imagination.

Most child development experts agree that it is simple toys, the kind a child can manipulate and animate themselves, that develop the imagination. One thing that makes childhood play so rich is that it is a chance for children to create miniature worlds in which they (and sometimes their friends) are in charge. Toys that talk back will alter and direct a child’s developing imagination rather than support it.

2. AI toys are educational.

The education being touted here includes early academic skills, such as number and letter recognition, phonics, early math skills, and science facts. There are already hundreds of toys on the market that promote these skills without the added expense or risks of AI.

3. AI-embedded toys assist with social skill development.  

This is the claim I found most alarming. Social skill development is a complex process that happens through interaction with other humans, including time with parents and caregivers and open-ended play with peers. There is no reason to believe that AI has a significant role to play in social skill development, and many reasons to be cautious. When AI distracts from human relationships, it is likely hindering, not supporting, social skill development.

4. AI is the tool of the future, and children should learn to use it and be comfortable with it.  

This one may actually have some merit for older children and teens. I can imagine parents wanting their children to have opportunities to interact with this new technology and become familiar with it. Perhaps early exposure will give them an edge. Perhaps when children understand how to use AI, they will be less vulnerable to its risks. If technological literacy is your motivation for introducing AI into the playroom, do this instead: buy the toy for yourself, then invite your child or teen to play with you.

When you buy the toy for yourself, you retain a level of control that is hard to have when the device belongs to the child. You decide when and where it gets played with. You are there to provide the guardrails if things go sideways. You can put limits on what information is shared. Most importantly, you are there with the child, sharing an experience together. In this way, AI can support real human interaction rather than just mimicking it.

I hope my thoughts on AI and kids have been helpful. I’d love to hear yours.  


If you would like more tips on helping children manage challenging behavior, I am available for consultation. I have worked with many parents, teachers and therapists over the years helping to find solutions to complex issues. A free 15-minute call can help determine the next best steps.
