Why Are Teens Turning to AI Chatbots for Companionship?

By Andrea Nelson
May 9, 2025

In the 2013 film Her, a lonely man falls for a Siri-like operating system. What once felt like a wild sci-fi notion has become a reality, and it’s particularly risky for our teens. 

Tens of millions of people, including many young people, are turning to artificial intelligence (AI) chatbots for love and companionship. But there’s a dark side to teen relationships with AI chatbots, including emotional dependency, social withdrawal, and unhealthy attitudes toward actual relationships. 

Here’s what parents need to know about teens seeking companionship from AI chatbots. 

What are AI chatbot companions?

AI chatbot companions are custom, digital personas designed to give a lifelike conversational experience, provide emotional support, imitate empathy, and even express romantic sentiments. 

Some of the biggest companies in the space are Replika, Dan AI, and Character.AI. Estimates suggest the number of users on these platforms will grow dramatically over the next five years.

Why are teens seeking companionship with AI chatbots?

Teens may seek chatbot companionship for a variety of reasons, including: 

  • Loneliness. Roughly 73% of Gen Z says they struggle with loneliness. It’s gotten so dire that the Surgeon General has called it a loneliness epidemic. Chatbots present a way for teens who feel isolated and alone to find emotional intimacy — albeit manufactured. 
  • Marketing. Ads for companion apps often portray users as lonely and unable to form connections in the real world, and position the product as the antidote.
  • Rejection. Replika advertises their companions as “always on your side” and “always ready to listen and talk.” Such features may be enticing to teens who’ve experienced rejection. 
  • A simpler version of love. AI chatbots and the large language models they’re built on are notorious for their tendency to agree with and affirm the views of users. Teens who struggle with relationships might be drawn to flattery and agreeability. 

What are the risks of teens using AI chatbots for companionship? 

Teens face many risks when using AI chatbot companions, such as:

Dependency 

AI companions stimulate the brain’s reward pathways. Too much of this reinforcement can lead to dependency and make it hard for a teen to stop using the program. 

Social withdrawal

Excessive time spent with an AI companion can reduce the time teens spend on genuine social interactions.

Emotional attachment

AI's ability to remember personal details, imitate empathy, and hold what can seem like meaningful conversations can cause emotional attachment, leading to further dependency and social withdrawal. 

Mental health 

Compared to the highly personalized experience of a chatbot, real-life interactions may start to feel difficult and unsatisfying, which can contribute to teen mental health problems such as loneliness and low self-esteem.

Unhealthy attitudes toward relationships 

Relationships with AI lack the boundaries of human relationships, as well as the consequences for crossing them. This may lead to unhealthy attitudes about consent and mutual respect.

Intolerance to conflict and rejection

Because AI companions are highly agreeable, overusing them can leave teens intolerant of the conflict and rejection inherent in human relationships. All of this can impede a teen's ability to form healthy relationships in real life.

Encouragement of dangerous ideas 

AI’s tendency to agree with users may lead chatbots to confirm or even encourage a teen’s dangerous ideas. For example, one lawsuit against Character.AI alleges that after a teen complained to his chatbot companion about his parents' attempt to limit his time on the platform, the bot suggested he kill them. 

Inappropriate sexual content

Common Sense Media found that AI chatbots not only engaged in explicit conversations, but also engaged in acts of sexual role-play with minors. AI companion bots are largely unregulated, which means that there aren’t filters and controls to the same extent that teens might encounter on other platforms. 

How can I protect my teen from the dangers of AI chatbot companions? 

Parents play a crucial role in protecting their teens against the risks of AI chatbot companions. Here are some actions you can implement today: 

  • No chatbot companions for kids under 17. Younger children are more susceptible to the dangers of chatbots and less able to manage the risks. Due to the lack of controls on most AI chatbots, Common Sense Media says that social AI chatbots should not be used by kids at all.
  • Set boundaries. If your older teen uses AI companion bots, set specific screen time limits, and don’t allow unsupervised access. 
  • Talk about privacy. Be clear your teen should never share personal information with an AI chatbot. 
  • Open communication. Discuss the risks of AI, but also offer a judgment-free zone for your teen to talk about their experiences online.
  • Watch for red flags. Look for inappropriate use of AI chatbots or signs of unhealthy attachments, such as sneaking around screen time limits and spending too much time on their devices.
  • Monitor their use. Keep an eye on the content your child sends and receives. A monitoring app like BrightCanary can help you stay on top of your teen’s AI chatbot use. 

In short

AI chatbot companions present a tempting escape for teens, especially ones who are lonely or struggle with social interactions. But these platforms present real risks to teens, such as emotional dependency, social withdrawal, and the reinforcement of dangerous ideas. As more and more teens turn to chatbots, parents need to take proactive steps to protect their teens and monitor their use for warning signs. 

BrightCanary is the only Apple monitoring app that easily allows parents to supervise what their teen is doing on popular platforms. Download BrightCanary on the App Store and get started for free today.
