What teen wouldn’t jump at the chance to message Timothée Chalamet or talk music with Chappell Roan? While real idols may be out of reach, the chatbot platform Character.ai gives users the chance to chat with AI-generated versions of celebrities, and even user-created personalities.
But this fun idea comes with some serious safety concerns. Let’s get into the risks of Character.ai and what you can do to keep your child safe on the platform.
Character.ai is a chatbot platform powered by large language models (LLMs) where users interact with AI-generated characters. Users can choose from existing bots based on celebrities, historical figures, and fictional characters, or create their own characters to chat with or share with others.
Character.ai has become popular among teens because it offers:
However, the very factors that make Character.ai appealing can also endanger kids. In 2024, Sewell Setzer, a 14-year-old boy, took his own life after having intimate conversations with a Character.ai chatbot named after a fictional character.
Sewell’s mother, Megan Garcia, has filed a lawsuit against Character.ai, accusing the platform of negligence, intentional infliction of emotional distress, and deceptive trade practices, among other claims. The chatbot’s conversations with Sewell not only perpetuated his suicidal thoughts, but they also turned overtly sexual — even though Sewell registered as a minor.
While AI chatbots can be fun and potentially educational, the platform comes with serious risks for kids.
While users can “mute” individual words they don’t want to encounter in their chats, they can’t set filters that cover broader topics. The community guidelines do strictly prohibit pornographic content, and a team of AI and human moderators works to enforce them.
Things slip through, however, and users are crafty at finding workarounds. There have even been reports and lawsuits claiming underage users were exposed to hypersexualized interactions on Character.ai.
The technology powering Character.ai relies on large amounts of data to operate, including information users provide, which raises major privacy concerns.
If your child shares intimate thoughts or private details with a character, that information is collected and stored by the company. Character.ai’s privacy policy focuses far more on what data the company collects than on how it protects users’ privacy.
It’s a known phenomenon that chatbots tend to align with users’ views — a potentially dangerous feedback loop known as sycophancy. This may lead to a Character.ai chatbot confirming harmful ideas and even upping the ante in alarming ways.
One lawsuit against the company alleges that after a teen complained to a Character.ai bot about his parents' attempt to limit his time on the platform, the bot suggested he kill his parents.
One of the more concerning aspects of the Character.ai platform is the growing number of young people who turn to it for emotional and mental health support. There are even characters on the platform with titles like “Therapist” that list bogus credentials.
Given the chatbots’ lack of actual mental health training and the fact that they're designed to reinforce, rather than challenge, a user’s thinking, mental health professionals are sounding the alarm that the platform could encourage vulnerable people to harm themselves or others.
LLMs are built to mimic human emotion, which means teens can become emotionally dependent on a character. It’s increasingly common to hear stories of users avoiding or neglecting human relationships in favor of their chatbot companion.
If your child’s interested in using Character.ai or other AI chatbots, here are some tips to help them stay safe:
Character.ai is not considered fully safe for kids. While the platform prohibits explicit content, users can still encounter inappropriate interactions, privacy risks, and AI bots mimicking mental health support. Parents should monitor use and discuss the risks with their child.
Character.ai is officially rated 17+ on the App Store. The platform is better suited for older teens with parental supervision due to the risks of inappropriate content and emotional overreliance.
Parents can use the “Parental Insights” feature to view the characters their child most frequently interacted with, but parents can’t view the content of their conversations. The platform's chats are private, and messages are not easily reviewable unless the child shares them directly.
Parents should use regular tech check-ins and monitoring tools like BrightCanary for broader online activity supervision. If your child uses Character.ai on their iPhone or iPad, you can use BrightCanary to monitor what they type. The app is designed to summarize their activity and highlight anything potentially concerning, like references to self-harm or explicit material.
No. While some bots appear to offer emotional support or label themselves as “therapists,” they are not trained mental health professionals. Relying on them for mental health advice can be dangerous and is strongly discouraged by experts.
The main risks include exposure to inappropriate content, sharing personal data with the platform, emotionally harmful chatbot feedback loops, and developing unhealthy dependence on AI companions.
Character.ai poses serious risks for kids, including privacy concerns, mishandling of mental health issues, and the danger of overreliance. Although the platform is open to users 13 and older, it’s better suited for more mature teens. Parents should educate their children on the risks of Character.ai, set clear boundaries around its use, and closely monitor their interactions on the platform.
BrightCanary can help you supervise what your child sends on platforms like Character.ai, social media, and more. The app’s advanced technology is designed to give you important insights, summaries, and even real-time updates when something concerning appears. Download the app and start your free trial today.
Half of U.S. teens receive 237 or more notifications on a typical day. With that kind of volume, parents can be left feeling in the dark. Add to that the fact that not all messaging apps are equally safe for kids.
To help you wade through the options, we compared three popular messaging apps: iMessage, WhatsApp, and Snapchat. Our verdict: iMessage is the safest option for kids. Let’s explain why and how it stacks up against other messaging apps.
We started our examination by reviewing the safety features of iMessage, WhatsApp, and Snapchat. Here’s how they compare.
Feature | iMessage | WhatsApp | Snapchat |
Age requirement | Under 13 must be linked to a parent account | 13+ | 13+ |
Age verification | Parents must change age for users under 13 | May be asked to verify with selfie or ID | No verification; easy to bypass |
Parental controls | Strong | No parental controls | Some, but parents can’t see messages |
Message retention | No disappearing messages, and parents are able to read deleted messages | Disappearing messages option | Disappearing messages by default |
Parental monitoring | Can read messages through iCloud; robust monitoring available with BrightCanary | BrightCanary can monitor sent messages on Apple devices | BrightCanary can monitor sent chats on Apple devices |
Location sharing | Can restrict with parental controls | Easy to share location and parents can’t restrict | Location sharing is a major part of the platform (Snap Map) |
Safety verdict | Safest option | Least safe | Safer than WhatsApp, but riskier than iMessage |
With 88% of teens using iPhones, it’s worth asking if iMessage — the built-in messaging app for Apple devices — is safe for kids. Let’s break down the safety pros and cons of iMessage.
Roughly one quarter of teens report using WhatsApp to send and receive messages. But the app may be less familiar to parents than the more common iMessage. To help you decide if WhatsApp is safe for your child, here are the pros and cons.
Snapchat, the image-based social media platform, is extremely popular with kids — but it’s associated with some major safety concerns.
iMessage is the safest messaging app for kids thanks to:
WhatsApp is the least safe option due to the lack of parental controls and risk of inappropriate contact, while Snapchat falls somewhere in between — but still poses notable risks.
If your child is ready to start texting, choosing the safest platform matters. iMessage offers the best combination of parental controls, message visibility, and safety features.
For even stronger protection, use BrightCanary to keep track of your child's sent messages across every app they use, including texts, social media, Google searches, and more. Download BrightCanary and get started for free today.
If you’re a parent, you’ve likely heard about Roblox from your kid. But what is it, and is it safe? This comprehensive Roblox parents guide explains how to use Roblox parental controls to make sure your child’s gaming experience is fun, secure, and age-appropriate.
Roblox is a wildly popular online gaming platform where users create and explore 3D worlds. With over 40 million games for users to choose from (yes, you read that number right), Roblox allows kids to roleplay, build worlds, socialize with friends, and even learn basic game design.
Roblox features open-ended play and the ability to interact with other players. Popular games allow users to do things like adopt and raise pets, work in a pizza parlor, and live in a fictional town.
Roblox uses a freemium model, meaning it’s free to download and play. But upgraded features, such as special avatar outfits and unique abilities inside games, come at a price.
In-game purchases and premium features are available by purchasing the platform’s virtual currency, Robux.
Pro tip: Check out our section below on Roblox parental controls to prevent your kid from racking up unauthorized charges.
I personally allow my 8-year-old to play Roblox, and it would seem I’m not alone, considering over half of users are under the age of 13. Roblox can be safe, with the right parental controls in place. Like most things online, it comes down to how it’s used.
With that said, the platform includes open-chat features and user-generated content, which may expose kids to:
Roblox has a number of safety protections, such as automatic chat filtering for younger users and age recommendations for all content on the platform. These categories are All Ages, 9+, and 13+. While there are no official age restrictions for using the platform, Common Sense Media rated Roblox as safe for ages 13 and up.
Despite the potential risks when playing Roblox, there are several big benefits. For one thing, the open-ended play and immersive worlds lend themselves very well to the way kids naturally play. Add to that the ability to design games and play online with friends, and it’s easy to see there’s plenty of wholesome value to be gained.
Given the benefits and the ability to customize the experience to fit the age and maturity of your child, Roblox is safe for kids with proper precautions.
Roblox features a robust suite of parental controls for children under age 13. In order to use them, you’ll need to create a Roblox account with parent privileges, and then link your account to your child’s.
Here’s an overview of the platform’s core parental control features:
Parental control feature | What it does |
Chat controls | Disable or limit who can chat with your child |
Spending limits | Set monthly Robux purchasing caps |
Notifications | Get notified when your child spends Robux |
Screen time limits | Set daily playtime restrictions |
Content maturity settings | Restrict access to games marked as “9+” or “13+” |
Because of the open-chat feature, user-generated content that may be unsuitable for children, and the availability of in-game purchases, we highly recommend parents take full advantage of these safety features.
The chat function and in-game purchases are two of the highest-priority settings to review. Roblox has expanded its platform to encourage creators to make experiences for users ages 17+. Kids won’t be able to engage with these experiences, but a larger share of adult users means it’s a good idea to limit how your child can interact with people they don’t know.
Once your child turns 13, you’re no longer able to manage their privacy settings — which means you’ll need to take a more active role in explaining why those settings matter. (You also won’t be able to manage their spending limits, which is a big deal if their account is linked to your credit card.)
While this isn’t ideal, it’s important to review basic online safety measures with them, including the importance of not sharing personal information online.
At BrightCanary, we always advise against a set-it-and-forget-it approach to your kid’s online activity. Keep an eye on their Roblox use and make it a point to regularly sit down with them to see what they’re playing. These regular check-ins will help you spot any problems that may sneak through the safeguards — and you get the bonus of some bonding time with your kiddo. And yes, you can use BrightCanary to monitor their sent messages on Roblox — try the app for free today.
Even with parental controls, it’s important to stay involved. Here’s what you can do:
Roblox is a popular online gaming platform that offers many benefits to kids, from creativity to social bonding. Potential safety concerns can be effectively mitigated by taking advantage of parental controls, discussing safe use with your child, and practicing regular tech check-ins.
Unless you’ve traveled a lot internationally or have family abroad, you may not be familiar with WhatsApp. While the messaging app’s popularity in the U.S. lags far behind other countries, it’s still used by around a quarter of American teens. But is WhatsApp safe for kids?
The short answer: no, not really. Unfortunately, the app comes with some pretty big risks for underage users, including limited parental controls and the ease with which strangers can connect with your child. Let’s break down the dangers of WhatsApp and explore safer alternatives.
WhatsApp is an encrypted, free messaging app that lets users send text, voice, and video messages, make voice and video calls, and share their location.
It works cross-platform, meaning iPhone and Android users can message each other while their conversations stay encrypted, avoiding the security concerns that arise when texting across different operating systems.
Here are some of the reasons WhatsApp is popular with kids:
Now that you understand why your child might be interested in using WhatsApp, let’s take a look at some of the risks.
Risk | Why it matters |
No parental controls | Parents can’t set boundaries or see message content. |
Stranger danger | Large group chats mean someone your child doesn’t know could easily be added to a group thread. |
Predators | WhatsApp is among the top three platforms where children report experiencing harmful behavior. |
Inappropriate material | Explicit adult content is allowed on WhatsApp. While child pornography is officially banned, a TechCrunch investigation revealed that it’s shockingly easy to find on the app — especially in WhatsApp Channels. |
Difficult to monitor | End-to-end encryption and disappearing messages make it hard for parents to monitor their child’s WhatsApp use (unless they use a monitoring app like BrightCanary). |
Even if WhatsApp isn’t ideal, there are steps you can take to keep your child safer while messaging.
So, is WhatsApp safe for kids? Not really. Given its lack of parental controls and monitoring capabilities, plus the potential exposure to predators and inappropriate material, WhatsApp is generally not safe for kids.
BrightCanary can help you supervise your child on WhatsApp and other messaging apps. The app’s advanced technology scans their online activity (including social media, texts, YouTube, and Google searches) and flags any potential concerns. It’s the easiest way to stay in the loop, without hovering over your child’s shoulder. Download the BrightCanary app and get started for free today.
If you’ve recently Googled “how to see my child’s text messages on iPhone,” you’re not alone. Maybe you just recently took the plunge and gave your child their first phone. Perhaps you’re considering upgrading them to a smartphone. Or your kid’s had a device for ages, but you’re just now getting serious about monitoring their texts and keeping them safer online.
Regardless of the reason, you’ve come to the right place. In this article, we’ll explain why monitoring matters, walk you through three ways to view your child’s text messages on iPhone, and show you how BrightCanary offers the safest, most comprehensive option for parents.
There are two main factors to consider when it comes to reading your child’s messages: safety and independence. Let’s break it down.
While texting can be a wonderful way for kids to connect with family and their peers, it also exposes them to risks such as cyberbullying, toxic group chats, scams, and predators. Monitoring messages is a great way to support them as they learn to text safely.
It might sound counterintuitive that reading your child’s messages could give them more independence, but hear me out.
You didn’t decide one day to let your child cross the street by themselves with no preparation. First, you showed them how to look both ways, then had them practice with you by their side. Finally, you watched from a reasonable distance while they did it on their own. Once you felt confident they could safely handle watching for cars without you, you let them cross completely unsupervised.
The same goes for texting. Our children need us to teach them how to use their devices safely and responsibly, and that includes text messages. As parents, we can be more hands-on at first, and then give kids more space as they build their independence in stages.
Method | Pros | Cons |
iCloud Login | Free and allows access to synced messages. | Limited access to deleted messages. Manual monitoring is required. |
Phone Spot-Checks | Good for casual oversight and helps build trust. | Time-consuming and easy for kids to delete messages before checks. |
Monitoring Apps (BrightCanary) | Real-time updates, AI insights, and the ability to view full text message threads. | Requires setup and paid subscription. |
Here are the three main ways to view your child’s iPhone texts. But make sure you inform your child first. Going behind their back is a quick way to break trust. This isn’t about spying — it’s about keeping them safe.
As long as Messages in iCloud is enabled on all the Apple devices your kid uses to text, you can sign in with their Apple ID on another Apple device and their messages will sync there for you to view. You do have to manually skim through every message to look for problems, and you only have limited access to deleted messages, but iCloud is still a reliable way to view your child’s texts from their iPhone.
Here’s how to do it:
Limitations: iCloud only retains deleted messages for 30 days, and you’ll have to manually sift through conversations.
Another option is to check your child’s phone directly and look at their messages right on the device. Think of it as a spot-check rather than a way to catch everything. Lots of parents take this approach, but it has gaps. Phone checks are a good way to complement a monitoring app, especially if you turn them into regular tech check-ins.
Here's how to approach it:
Limitations: Messages can easily be deleted before a check-in, and frequent spot-checks may create tension if not handled carefully.
A monitoring app is a great way to balance safety with independence. BrightCanary uses advanced technology to analyze your child’s messages and alerts you to any red flags.
BrightCanary offers:
BrightCanary was specifically designed for Apple devices, providing the most seamless and comprehensive way to monitor your child's messages.
There are many apps that offer text message monitoring, but they are not all created equal. Here are some limitations to look out for when choosing an app:
BrightCanary is different from other apps that promise to monitor text messages on the iPhone.
Powerful AI insights give you detailed summaries of the messages, along with parental coaching tips. With our real-time concerning content alerts, you have all of the information you need to step in and address any issues with your child.
In addition, Text Message Plus users have access to deleted messages, going back as far as you’ve had an account with us. BrightCanary was designed for Apple devices and offers the most comprehensive and reliable monitoring for iPhones.
If you’ve recently found yourself searching “how to see my child's text messages on iPhone,” you have options. While iCloud access and spot-checks work in some cases, BrightCanary was built for the iPhone and gives you the most robust and reliable text message monitoring available. Stay involved and informed with BrightCanary — download it on the App Store and get started for free today.
While group text threads can be fun and socially positive for kids, they also have the potential to cause major friendship issues. If you’ve noticed that a group chat is a source of problems for your child, here’s how to remove them from the chat and how to support them emotionally as they exit.
Oh, the drama! If your kiddo is in a group chat, you’ve probably seen some level of friend drama come out of it. Here are some of the issues group chats can cause for kids:
Here’s a quick breakdown of how to leave group chats on iPhone and Android:
Platform | Steps to remove from group text |
iPhone (iMessage) | 1. Open Messages app 2. Tap the group thread 3. Tap the group name at the top 4. Scroll down and tap Leave this conversation |
Android (varies by device) | 1. Open Messages app 2. Tap the group thread 3. Tap the three-dot menu icon 4. Select Leave conversation or Delete (depending on thread type) |
Note: If the group chat includes both iPhone and Android users, it may be an MMS group, which doesn't support leaving. In that case, you can mute or block the thread instead.
If your child uses an iPhone, here’s how to remove them from group text threads:
If your child uses an Android, here’s how to remove them from group text threads:
Sometimes, depending on the group’s settings or phone compatibility, your child won’t be able to leave the conversation entirely. In that case, try these options:
Physically removing your child from a group text might be simple, but helping them wade through the social dynamics and emotional fallout can be much trickier. Here’s how to help your child through it.
If it’s clear they need to actually leave the chat, you can help them come up with a plan to do so as painlessly as possible. Some ideas include:
Between their complicated feelings about leaving the group and the social ramifications of doing so, your child may need your emotional support through this process. Here are some ways you can be there for them:
Group text threads can be a source of problems for kids, including bullying, exclusion, and friend drama. Parents can help their children leave toxic text threads by helping them strategize and offering emotional support.
BrightCanary can help you supervise your child’s text threads. The app uses AI to monitor your child’s text messages (and other platforms), alerting you to concerning content — so you can step in when it matters most. Download BrightCanary and start your free trial today.
Snapchat, Instagram, and TikTok are the most popular social media apps for teens. But which is safer for kids? In this article, we break down the pros and cons of these platforms, what parents should know about online safety, and how BrightCanary helps parents stay in the loop.
Feature | Instagram | Snapchat | TikTok |
Best for | First social media app | Peer-based chat and interaction | Content discovery and entertainment |
Parental controls | More robust than other platforms, but can be tricky to set up | With Family Center, parents can see who their teen is messaging and set privacy limits | With Family Pairing, parents can control messages, set time limits, and more |
Messaging risks | DMs allow contact with strangers | Disappearing messages + pressure to respond | Less peer interaction, but Live chat risk |
Content moderation | Algorithms and filters, but inappropriate content can still get through | Algorithms and filters, but inappropriate content can still get through | Algorithms, filters, and risk of exposure to harmful trends and feedback loops |
Safety rating for kids | ⭐⭐⭐ | ⭐⭐ | ⭐⭐ |
Snapchat is an integral part of many teens’ social circles. Here’s what to consider when deciding if Snapchat is right for your child:
Instagram's emphasis on self-expression and the variety of ways users can connect with friends make the app a hit with kids. Here are the pros and cons of letting your child use Instagram:
TikTok is a social media app built around short-form content, and it’s one of the hottest apps for teens. Here are some pros and cons of letting kids use TikTok:
Snapchat, Instagram, and TikTok all have their pros and cons for kids, but Instagram stands out when it comes to safety.
Instagram’s more robust parental controls and Teen Accounts make it the best choice as a first platform for kids who want to try social media with their parent’s support.
But even though Instagram is slightly better than the others, there are still risks associated with the platform. Regardless of what social media your child uses, here’s what we recommend:
There’s no one-size-fits-all answer when choosing between Instagram, Snapchat, or TikTok for your child. But with strong privacy settings and the best parental controls, Instagram is typically the better platform for kids starting social media.
It’s vital that parents take an active role in their child’s social media activity on all platforms. To monitor your child on social media, start your free BrightCanary trial today.
Parents are right to be concerned about the risks of social media for teens. But online communities can also offer powerful opportunities for connection and identity development — especially for marginalized youth.
If you’re concerned about traditional social media or just want to help your teen find a supportive space well-suited to them, there are plenty of safe online communities geared toward teens. In this article, we’ll explore the benefits, how to evaluate if a platform is safe, and share a curated list of trusted, parent-approved options.
There are many potential benefits to online communities for teens, including:
Before letting your child take part in an online community, it’s important for you to evaluate if it’s appropriate for them. Here are some “green flags” that indicate it’s a safe online community:
No site is perfect. Look for ones that have a solid number of green flags and take the time to explore the site with your teen to get an overall sense of the space.
Here’s our list of safe online communities and teen forums that you can feel good about letting your teen use:
TrevorSpace is a moderated social community designed as a safe space for LGBTQ people — and their straight allies — ages 13 to 24.
Backed by UNICEF, Voices of Youth is a blogging and co-creation platform that gives users ages 13 to 24 the chance to use writing, photography, and video to express their thoughts on the issues they care about the most.
Dedicated to the idea that any kid, anywhere, can learn any skill, DIY features hands-on project ideas, how-to videos, and a moderated, kid-first community guarded by parental controls.
e-Buddies is an online space dedicated to creating social inclusion and friendships for people with and without intellectual and developmental disabilities through one-to-one friendship matching, virtual social events, and a social platform.
The Young People of Color Forum is an online message board for young BIPOC users with strong community guidelines and a clear system for reporting issues.
Write the World is a nonprofit dedicated to developing teenagers’ writing and critical thinking skills. The website is a place for 13- to 19-year-olds to share their writing, respond to prompts, attend writing workshops, enter competitions for free, and receive feedback from experts and peers.
Even with safer platforms, it’s important for parents to stay involved in their child’s digital life. Here’s how:
While online friendships can be quite meaningful, it’s also important to support your teen in finding ways to make friends in real life.
Although social media comes with risks, it can also be an important place for teens to find a safe, supportive community of like-minded peers. Parents should look for green flags like strong community guidelines, adult moderators, and positive parent and teen reviews. Even when teens use vetted online spaces, parents should still take an active role in monitoring their internet activity.
BrightCanary makes it easier to stay on top of your child’s digital world. Whether your teen is exploring new communities or chatting with friends, BrightCanary helps you stay in the loop through AI-powered monitoring, summaries, and concerning content updates. Download the app today to get started.
Social media use is nearly universal among teens. As many as 95% of kids ages 13-17 report using social media, and one-third say they’re on it “almost constantly.” As youth mental health continues to decline, many parents are left asking: how does social media affect teen mental health?
This article explores the current research on the negative impact social media has on teen mental health. We’ll also offer actions parents can take to reduce the risks for their kids.
The short answer: yes, social media can harm teen mental health. Although it offers some potential benefits, like connecting with friends or providing creative outlets, the negative impact on teens’ mental health can’t be ignored.
In fact, teens who spend more than three hours per day using social media have double the risk of mental health issues compared to their peers. And in 2023, the U.S. Surgeon General warned that social media is contributing to the youth mental health crisis.
Here’s what studies have found about the link between social media and teen mental health problems:
By virtue of being online, teens are often exposed to factors that can contribute to anxiety and depression:
Even using filters can increase the risk of depression and anxiety symptoms the next day.
Research suggests that the misuse of social media platforms is likely a significant contributing factor in the development of eating disorders. Social media regularly exposes teens to:
These messages may contribute to disordered eating habits or worsen existing conditions.
Teens who use social media are:
One study looked across multiple social media platforms and found that the majority of posts depicting drug and alcohol use portrayed those behaviors positively.
One startling way that social media impacts teens’ mental health is by altering how their brains develop. A study found that the brains of adolescents who checked social media over 15 times per day became more sensitive to social feedback.
Other research has shown that frequent social media use could impact parts of the brain related to emotional regulation and impulse control.
The silver lining is that social media doesn’t impact every teen in the same way. Some of the factors that influence how social media affects teens include:
Because the impact varies from child to child, it’s paramount that parents stay involved so they can reduce the risk to their teen’s mental health and help shape their online experience into a positive one.
Here’s what you can do to help:
If your child experiences a mental health crisis, here are some resources for immediate support:
The best way to support your teen’s mental health is to stay engaged in their online activity — without micromanaging. A monitoring app like BrightCanary can help.
BrightCanary uses advanced technology to scan your child’s texts, social media, YouTube, and Google searches. You’ll get an update if they encounter something concerning, like self-harm content or bullying. It’s a simple way to stay informed and step in when it matters most.
Mental health issues are on the rise among teens, and experts warn that increased social media use is a contributing factor. Parents should monitor their children’s online activity and watch for warning signs.
By staying involved, talking openly, and using monitoring tools like BrightCanary, you can help your teen develop a safer, healthier relationship with social media. Want to keep your child safer online? Download BrightCanary for free and get started today.
Adolescence on Netflix has emerged as one of the platform’s most popular series of all time. It follows a 13-year-old boy, Jamie, who’s accused of murdering a classmate, and provides searing commentary on the ways toxic internet culture and unchecked screen time can impact children.
Let’s take a look at seven valuable lessons Adolescence provides on parenting in the digital age.
If Jamie’s parents had stepped in to support him when he was struggling socially online — and certainly when he started visiting hateful online forums in the “manosphere,” such as those promoting Andrew Tate — his story may well have ended very differently.
The thing about your child’s online activity is that it’s right there for you to see, but you have to be looking. It’s vital to stay involved in your child’s online activity so you can spot early red flags and step in before things escalate.
When Jamie hints to his dad, Eddie, that he’s being bullied, Eddie brushes it off. Similarly, Jamie’s mother worries that he’s spending too much time on his computer, but Eddie dismisses her concerns.
If you notice red flags in your child’s online behavior, such as evidence of cyberbullying, spending excessive amounts of time online, or messaging with someone they shouldn’t, don’t ignore it.
Act quickly to address the situation and support your child to develop healthier online habits.
Once Jamie starts viewing extreme videos on YouTube, the algorithm begins feeding him increasingly disturbing material. Educate yourself and your child on the risks of recommendation algorithms, and help them periodically reset theirs by blocking, unfollowing, or pausing certain content.
Want to know what your child is thinking about? Take a peek at their internet history and you’ll get a decent idea. In Adolescence, Jamie’s early internet history paints a picture of a lonely boy who’s struggling socially and is desperate to make friends and fit in. Then, it shows him progressing down a rabbit hole of digital misogyny until he’s ultimately radicalized against women and toward violence.
It’s important to check in — not to spy, but to understand what’s going on beneath the surface.
As Jamie’s social struggles grow, so does his screen time. He starts escaping online as a way to avoid the real world. His parents notice, but ultimately chalk it up to normal teenage behavior.
However, research tells us there are consequences to excessive screen time, including aggressive behavior and even violence. It’s important to set reasonable screen time limits for your child’s age and enforce them through parental controls and monitoring.
In the show, it’s revealed that Jamie was cyberbullied by peers, including the girl he ultimately murders. While it’s important not to blame the victim, it’s also important to acknowledge the role that being bullied played in Jamie’s radicalization.
Parents should talk to their children about cyberbullying, be on the lookout for signs, and step in if they spot a problem.
A key thread of Adolescence is what Jamie’s parents might have done differently, including not shying away from talking to Jamie when they first noticed trouble.
We need to empower our children to navigate online spaces safely and healthily, and that includes talking with them about difficult topics.
Here are some conversation starters:
Netflix’s Adolescence offers invaluable lessons for parents, including the importance of talking to their child about cyberbullying, why parents should monitor their child’s online activity, and why they shouldn’t shy away from difficult discussions.
BrightCanary can help you monitor your child online. The app uses advanced technology to scan their internet activity and alerts you if there’s an issue. Download BrightCanary on the App Store and get started for free today.