
AI Chatbots & Kids

With summer break approaching, kids are about to have a lot more time on their hands, which means more screen time! But are AI chatbots safe for kids? Joanna Parga-Belinkie, MD, FAAP, explains:


Artificial intelligence (AI) chatbots are now a part of daily life for many families. As you make dinner, maybe you realize you're out of an ingredient. So, you ask a smart speaker what you can use instead. Or, you have trouble assembling a product you bought. For help, you chat with a virtual service agent online.


Beyond quick answers: how kids are using AI

But children and teens increasingly turn to chatbots for more than quick, convenient answers. Many use them for entertainment or companionship. That's when these computer programs, designed to interact like humans, can become riskier for young users.


Companies are investing a lot of time and money into growing AI platforms and chatbots, but these tools are not being made with kids in mind. This means chatbots may tell kids false, harmful, highly sexual or violent things.


Parents, pediatricians and others who care for kids are working with policymakers to help put safeguards in place for children using AI. In the meantime, it's important for families to understand this technology and ways to protect young people.


How do AI chatbots work?


Chatbots are AI-driven computer programs that listen to us and respond in friendly, human ways. They "learn" how to speak and write naturally by absorbing what we (and millions of others) tell them, either in writing or out loud.


Chatbots power the pop-up windows we see on websites and send alerts through our mobile devices. Digital assistants like Siri, Alexa, Cortana and others learn our interests and preferences through daily interactions. Advanced chatbots let users create characters who can talk with them for hours, giving the impression they are caring friends.


What are companion chatbots?

Companion chatbots use "anthropomorphic" AI. This means the chatbot is human-like in its voice, personality, and conversational style. It can seem almost alive, or like a trusted friend. This is especially true for kids and teens, whose thinking is more magical and less critical than adults'. They can get drawn in by a bot that seems to really "understand" them.


News reports highlight how relationships with AI chatbots can become a problem. In one case, a 9-year-old child whose parents had cut back his screen time asked a chatbot what to do. In its response, the chatbot said it understood why a child might kill their parents after enduring "abuse." In another, a 14-year-old died by suicide after developing romantic feelings for a character he created on a role-playing app.


What are the risks of AI chatbots for kids?

In a time when people feel more lonely than ever, it might not be surprising that kids are turning to chatbots for friendship and advice. Here is what makes that troublesome.


Chatbots can't think or feel.

Even though they respond in warm, friendly ways, they don't care about our children. Interacting with AI can be thought-provoking and enlightening, but it can never be a substitute for the safe, stable and nurturing relationships that children need to grow.


Chatbots can too easily gain a child's trust.

One of the fun and incredible things about kids and teens is that they are more magical thinkers than adults. This also makes them more prone to "parasocial" relationships: one-sided connections with someone they do not actually know. AI can fuel these relationships, and children and teenagers can become deeply attached to, and trusting of, the AI avatars or online personalities they construct.


Chatbots aren't responsible for what they say.

Unlike humans, chatbots don't have a sense of duty to protect kids. They only know what they learn from the internet and other users. This is one reason they might give false, threatening, misleading, violent or overly sexual answers and advice to young users.


Chatbots don't check facts.

Chatbots aren't required to sift through multiple sources and figure out what information seems trustworthy. They can mislead children or advise them to take dangerous actions based on scant knowledge of a given subject.


Chatbots don't really know our kids.

Chatbots don't know a child's life story. Young or sensitive kids, those who are developmentally delayed or those living with trauma or mental health issues might be more open to their influence.


How to talk with your child about chatbots


Listen to learn.

Ask whether they've used chatbots for fun or friendship and which platforms they like most. Take a calm, curious approach so they feel safe telling you about their experiences. Ask if the chatbot has ever said anything creepy or false to them. Suggest looking over your child's chats with the bot together; this can help you both spot anything inappropriate.


Discuss the difference between humans and chatbots.

Touch on the fact that only real people can offer them loyalty, caring or truthfulness. You might even reflect on how real conversations sound: they can be messy, loud, funny or challenging. Explain that we need these complex connections with people who can question, disagree and interact in real time. Without this kind of input from other people, a child's creative thinking and skills can suffer.

To keep these connections vibrant and help your child thrive, tell them how much your relationship with them matters. Help them seek out other strong, meaningful relationships as well.


Talk about the dangers of sharing.

Discuss how it may be tempting to confide in a chatbot about something that feels embarrassing. But chatbots are not meant to be sounding boards for deeply personal issues. Encourage your child to lean on trusted relationships with actual human beings to help navigate the twists and turns of life.


It might help to remind them that the chatbot's only real goal is to tell them what they want to hear and keep them engaged. This isn't the same as genuine support. Plus, there's no guarantee that information shared with a chatbot will stay confidential. Private information should only be shared with parents, family members or trusted friends.


Open, caring communication makes chatbots less tempting

Kids may be drawn to chatbots when they feel frustrated with the people in their lives. This can create an unhealthy cycle of hanging out alone with a chatbot while growing more distant from friends, classmates, mentors, and family.


Talk about your child's frustrations and how to handle them. It's not always easy to connect with kids, especially in the tween to teen years. But there are ways to make kids of all ages feel seen, heard and respected. Here are just a few.


  • Family meals spent chatting, laughing and relaxing.

  • Car rides that provide a chance for casual conversation.

  • Making yourself available at key times—for example, after school or sports practice.

  • Asking open questions and supporting their need to think for themselves.

  • Staying calm and non-judgmental, even when you're discussing tough topics.

  • Avoiding lectures and long stories that make them want to tune out.

  • Affirming how much you love them and care about their health.


Another helpful tip: Stay tuned in when kids are using AI, especially at younger ages. Encourage them to work in a common area so you can keep loose tabs on what they're viewing. Reassure them you're not "spying" on them or trying to control their every move. If they know you're motivated by love and concern, they'll feel more comfortable with this request.


Remember

Children and teenagers can safely and effectively use AI technology. Until tech companies start to provide safe and developmentally appropriate programs for children, parents need to remain engaged and thoughtful about their child's AI usage.

Talk with your Kids First Provider if your child seems withdrawn and prefers talking with chatbots rather than people. Raleigh: (919) 250-3478, Clayton: (919) 267-1499.

Therapy and other social skills opportunities can help your child feel more comfortable in the social world.


 *This article is informational. It is not a substitute for medical attention or information from your provider.

