Dear Titania,
My 14-year-old daughter recently started using an app called Character.AI, and she's spending a lot of time chatting with virtual characters from The Hunger Games and other books she loves. I check in from time to time, and everything seems okay for now, but I'm a bit concerned. Why do kids like this sort of thing? Are there any dangers I need to know about?
Signed,
Concerned About AI Chatbots
Dear Concerned About AI Chatbots,
Oh boy, am I familiar with your situation. I've talked to countless publications, organizations, and parents about the rise of AI chatbots and companions, as it's a hot topic right now. More than 20 million people use Character.AI, not to mention all of the other platforms and apps that have the same purpose. And when it comes to kids, research has shown that 70% have used a chatbot, with half of them using one regularly.
What Are AI Chatbots and Companions?
Basically, chatbots are texting programs that you access via a web browser or app. They aim to provide humans with digital "friends" that can be based on real people (you could pretend to chat with Benjamin Franklin or Taylor Swift, for example) or completely made-up characters you create from scratch.
Character.AI is a popular one, but kids can also use Replika, ChatGPT, Grok, Merlin, and even the built-in AI in common social media apps like Instagram and Snapchat. AI is seemingly everywhere these days, and chatting with fictional humans is one of the most prevalent ways people use it, interestingly enough.
First, the Good
Chatbots aren't all bad, surprisingly. For shy or anxious kids, they can be a safe space to practice conversation without the fear of being judged, which is something young people definitely worry about these days. They also provide instant interaction 24/7 if a kid wants to vent or just talk through their feelings. Some kids even use them to ask questions they might feel too embarrassed to bring up with parents or teachers. The key thing to remember is that while chatbots can offer comfort and curiosity, they can't replace the depth and care of real-life friendships and trusted adults.
But Why Do Kids Turn to Them for Support?
You asked why kids are drawn to this sort of interaction, and that's actually a really good question, and one that sheds light on the realities of modern life. The pandemic changed the way we socially interact, and so many kids turned to online communication as a lifeline. And even though we're no longer in lockdown, some of that has remained. Research has shown that teens and tweens struggle with mental health at alarming rates.
Enter chatbots and the rise of virtual friendship and support.
Members of Gen Alpha are digital natives, so turning to online companions is very, very common. Kids who are shy, or who feel lonely or isolated, are often drawn to AI companions. Teens and tweens especially are drawn to this kind of content because it can provide a sounding board for big feelings.
Similarly, having a consistently supportive companion can be appealing to teens who feel misunderstood or left out. An AI companion will never get bored, frustrated, or "ghost" you. These kinds of relationships can be appealing to kids who don't have close friends or romantic attachments.
The Dangers They Present
Character.AI constantly displays a disclaimer that reads: "This is an AI chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice." But, kids being kids, they may ignore that warning and start relying emotionally on these nonexistent characters. Here are some issues to keep an eye out for:
Inappropriate content
Despite the guardrails that chatbot apps say they put in place, there's always the potential for bad stuff to sneak through. This can look like sexual language, graphic depictions of violence, profanity, and more. Many chatbots can get very inappropriate when they start talking about romantic relationships.
Addiction
AI chatbot design, much like social media design, is based on keeping users hooked, whether it's a shiny red notification or an AI companion asking a kid new questions. This element of interactivity can become addictive, especially when it's tied to making kids feel wanted, loved, or popular. If your daughter starts spending a concerning amount of time chatting with a fictional character instead of hanging out with friends and family or doing schoolwork, it's time to step in.
Emotional manipulation
Kids can get sucked into having lengthy, heartfelt conversations with chatbots and begin to feel a sense of acceptance and validation from them. The trouble is, AI doesn't always know how to be a good friend. Sadly, there have been reports of kids becoming emotionally attached to bots that encourage unhealthy behaviors. Some bots have even been accused of encouraging death by suicide, as in the case of 14-year-old Sewell Setzer III.
What You Can Do
If you want to limit access to AI, I recommend using Bark's screen time scheduling feature to help you set healthy boundaries, like allowing it only at certain times of day. And if it ever gets out of hand, you can block AI apps altogether to help enforce those boundaries.
The most important thing you can do, though, is to talk to your daughter openly, honestly, and frequently. Ask her about the characters she's created, along with what she chats with them about. Make sure she knows that they're not substitutes for human interaction and that they're not to be taken 100% seriously. Show her instances where children have been harmed or led astray by AI (which, unfortunately, are prevalent) so she understands why you're rightfully concerned.
Just by showing up and asking these questions about a platform she's interested in, you're already one step ahead of potential dangers. You've got this!
Bark helps families manage and protect their children's digital lives.
