How to Reset Your Child’s Social Media Algorithm

By Andrea Nelson
October 19, 2023

As a parent, you want your child to surround themselves with good influences. That’s true not only for who they spend time with in real life, but also for the people and ideas they’re exposed to on social media. 

If you or your child are concerned about the content appearing in their feed, one beneficial step you can take is to help them reset their social media algorithm. Here’s how to reset your child’s algorithm on TikTok, Instagram, and other platforms.

What is a social media algorithm?

Social media algorithms are the complex computations that operate behind the scenes of every social media platform to determine what each user sees. 

Everything on your child’s social media feed is likely the result of something they liked, commented on, or shared. (For a more comprehensive explanation, check out our Parent’s Guide to Social Media Algorithms.)

Social media algorithms have a snowball effect. For example, if your child “likes” a cute dog video, they’ll likely see more of that type of content. However, if they search for topics like violence, adult material, or conspiracy theories, their feed can quickly be overwhelmed with negative content.

Research clearly demonstrates the potentially negative impacts of social media on tweens and teens. How it affects your child depends a lot on what’s in their feed, and what’s in their feed has everything to do with algorithms.

Therefore, it’s vital that parents actively examine and reset their child’s algorithm when needed, and also teach them the skills to evaluate it for themselves.

Talking to your child about their algorithm

Helping your child reset their algorithm is a wonderful opportunity to teach them digital literacy. Explain to them why it’s important to think critically about what they see on social media, and how what they do on the site influences the content they’re shown.

Here are some steps you can take together to clean up their feed: 

Start with their favorite app

Resetting all of your child’s algorithms in one fell swoop can be daunting. Instead, pick the app they use the most and tackle that first. 

Scroll through with them

If your kiddo follows a lot of accounts, you might need to break this step into multiple sessions. Pause on each account they follow and have them consider these questions:

  • Do this person’s posts usually make me feel unhappy or bad about myself? 
  • Does this account make me feel like I need to change who I am? 
  • Do I compare my life, body, or success with others when I view this account? 

If the answer is “yes” to any of these questions, suggest they unfollow the account. If they’re hesitant — for example, if they’re worried unfollowing might cause friend problems — they can instead “hide” or “mute” the account so they don’t see those posts in their feed.

Encourage interaction with positive accounts 

On the flip side, encourage your child to interact with accounts that make them feel good about themselves and portray positive messages. Liking, commenting, and sharing content that lifts them up will have a ripple effect on the rest of their feed. 

Dig into the settings 

After you’ve gone through their feed, show your child how to examine their settings. These settings mostly influence sponsored content, but considering the problematic history of advertisers marketing to children on social media, it’s wise to take a look.

Every social media app has slightly different options for how much control users have over their algorithm. Here's what you should know about resetting the algorithm on popular apps your child might use.

How to reset Instagram algorithm

  • Go to Settings > Ads > Ad topics. You can view a list of all the categories advertisers can use to reach your child. Tap “See less” for ads you don’t want to see. 
  • Go to your child’s profile > tap Following > scroll through the categories to view (and unfollow) the accounts that appear most in your child’s feed.
  • Tap the Explore tab in the bottom navigation bar and encourage your child to search for new content that matches their interests, like cooking, animals, or TV shows.

How to reset TikTok algorithm

  • Go to Settings > Content Preferences > Refresh your For You feed. This is like a factory reset of your child’s TikTok algorithm.
  • Go to Settings > Free up space. Select “Clear” next to Cache. This will remove any saved data that could influence your child’s feed.
  • As your child uses TikTok, point out the “Not Interested” feature. Tap and hold a video to pull up this button. Tapping “Not interested” tells TikTok’s algorithm not to show your child videos they don’t like. 

How to reset YouTube algorithm

  • Go to Library > View All. Scroll back through everything your child has watched. You can manually remove any videos that your child doesn’t want associated with their algorithm — tap the three dots on the right side, then select Remove from watch history.
  • Go to Settings > History & Privacy. Tap “Clear watch history” for a full reset of your child’s YouTube algorithm.

What to watch for

To get the best buy-in and help your child form positive long-term content consumption habits, it’s best to let them take the lead in deciding what accounts and content they want to see. 

At the same time, kids shouldn't have to navigate the internet on their own. Social platforms can easily suggest content and profiles that your child isn't ready to see. A social media monitoring app, such as BrightCanary, can alert you if your child encounters something concerning.

As you review your child’s feed, watch for warning signs such as content that promotes self-harm, disordered eating, violence, or conspiracy theories.

If you spot any of this content, it’s time for a longer conversation to assess your child’s safety. You may decide it’s appropriate to insist they unfollow a particular account. And if what you see on your child’s feed makes you concerned for their mental health or worried they may harm themselves or others, consider reaching out to a professional.  

In short 

Algorithms are the force that drives everything your child sees on social media and can quickly cause their feed to be overtaken by negative content. Regularly reviewing your child’s feed with them and teaching them skills to control their algorithm will help keep their feed positive and minimize some of the negative impacts of social media. 


Just by existing as a person in 2023, you’ve probably heard of social media algorithms. But what are algorithms? How do social media algorithms work? And why should parents care? 

At BrightCanary, we’re all about giving parents the tools and information they need to take a proactive role in their children’s digital life. So, we’ve created this guide to help you understand what social media algorithms are, how they impact your child, and what you can do about it. 

What is a social media algorithm? 

Social media algorithms are complex sets of rules and calculations used by platforms to prioritize the content that users see in their feeds. Each social network uses different algorithms. The algorithm on TikTok is different from the one on YouTube. 

In short, algorithms dictate what you see when you use social media and in what order. 

Why do social media sites use algorithms?

Back in the Wild Wild West days of social media, you would see all of the posts from everyone you were friends with or followed, presented in chronological order. 

But as more users flocked to social media and the amount of content ballooned, platforms started introducing algorithms to filter through the piles of content and deliver relevant and interesting content to keep their users engaged. The goal is to get users hooked and keep them coming back for more.  

Algorithms are also hugely beneficial for generating advertising revenue for platforms because they help target sponsored content. 

How do algorithms work? 

Each platform uses its own mix of factors, but here are some examples of what influences social media algorithms:

Friends/who you follow 

Most social media sites heavily prioritize showing users content from people they’re connected with on the platform. 

TikTok is unique because it emphasizes showing users new content based on their interests, which means you typically won’t see posts from people you follow on your TikTok feed. 

Your activity on the site

With the exception of TikTok, if you interact frequently with a particular user, you’re more likely to see their content in your feed. 

The algorithms on TikTok, Instagram Reels, and Instagram Explore prioritize showing you new content based on the type of posts and videos you engage with. For example, the more cute cat videos you watch, the more cute cat videos you’ll be shown. 

YouTube looks at the creators you interact with, your watch history, and the type of content you view to determine suggested videos. 

The popularity of a post or video 

The more likes, shares, and comments a post gets, the more likely it is to be shown to other users. This momentum is the snowball effect that causes posts to go viral. 
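The factors above can be sketched as a toy ranking function. This is purely illustrative — real platform algorithms are proprietary and vastly more complex — and every name, number, and weight here is invented for the example:

```python
# Illustrative sketch of an engagement-based feed ranker.
# All weights and account names are made up for this example;
# real platform algorithms are proprietary and far more complex.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int    # popularity signals
    shares: int

def score(post, following, topic_affinity):
    """Combine the three example factors into one ranking score."""
    s = 0.0
    if post.author in following:               # friends / who you follow
        s += 2.0
    s += topic_affinity.get(post.topic, 0.0)   # your activity on the site
    s += 0.1 * (post.likes + 2 * post.shares)  # popularity of the post
    return s

posts = [
    Post("news_account", "politics", likes=50, shares=5),
    Post("best_friend", "pets", likes=3, shares=0),
    Post("stranger", "pets", likes=400, shares=80),
]
following = {"best_friend"}
topic_affinity = {"pets": 1.5}  # built up from past likes and watch time

# Rank the feed: highest score first
feed = sorted(posts, key=lambda p: score(p, following, topic_affinity),
              reverse=True)
```

Under this toy model, a wildly popular pet video from a stranger can outrank a friend’s post, and every “pets” video your child engages with would bump their pets affinity — the snowball effect in miniature.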

Why should parents care about algorithms? 

There are ways social media algorithms can benefit your child, such as creating a personalized experience and helping them discover new things related to their interests. But the drawbacks are also notable — and potentially concerning. 

Since social media algorithms show users more of what they seem to like, your child's feed might quickly become overwhelmed with negative content. Clicking a post out of curiosity or naivety, such as one promoting a conspiracy theory, can inadvertently expose your child to more such content. What may begin as innocent exploration could gradually influence their beliefs.

Experts frequently cite “thinspo” (short for “thinspiration”), a social media topic that aims to promote unhealthy body goals and disordered eating habits, as another algorithmic concern.

Even though most platforms ban content encouraging eating disorders, users often bypass filters using creative hashtags and abbreviations. If your child clicks on a thinspo post, they may continue to be served content that promotes eating disorders.

Social media algorithm tips for parents

Although social media algorithms are something to monitor, the good news is that parents can help minimize the negative impacts on their child. 

Here are some tips:

Keep watch

It’s a good idea to monitor what the algorithm is showing your child so you can spot any concerning trends. Regularly sit down with them to look at their feed together. 

You can also use a parental monitoring service to alert you if your child consumes alarming content. BrightCanary is an app that continuously monitors your child’s social media activity and flags any concerning content, such as photos that promote self-harm or violent videos — so you can step in and talk about it.

Stay in the know

Keep up on concerning social media trends, such as popular conspiracy theories and internet challenges, so you can spot warning signs in your child’s feed. 

Communication is key

Talk to your child about who they follow and how those accounts make them feel. Encourage them to think critically about the content they consume and to disengage if something makes them feel bad. 

In short

Algorithms influence what content your child sees when they use social media. Parents need to be aware of the potentially harmful impacts this can have on their child and take an active role in combating the negative effects. 

Stay in the know about the latest digital parenting news and trends by subscribing to our weekly newsletter.


When I was asked to answer whether YouTube Shorts is safe for kids, I was already aware of some risks. Reader, let me tell you, when I dug into the research, I was floored. 

Addiction, depression, sleep problems, and decreased attention span are just a handful of the dangers kids face from YouTube Shorts. 

In fact, YouTube Shorts can be just as problematic as other short-form video platforms like TikTok and Instagram Reels. What I learned will definitely cause me to rethink how I let my own child use YouTube, and I encourage you to do the same.

What is YouTube Shorts?

YouTube Shorts allows users to create and view short-form videos. However, the viewing experience is far different from the longer videos that YouTube is most known for. 

Shorts are accessed through a dedicated, social media-like scrolling feed. Users can interact with the videos by liking, commenting, and sharing them.

What are the risks of YouTube Shorts? 

YouTube Shorts pose the same risks as longer videos on the app, like inappropriate content, cyberbullying, and exposure to predators. But the short-form nature of YouTube Shorts introduces additional risks, similar to the dangers kids face from platforms like TikTok. 

1. Addiction 

The concise, high-intensity, fast-paced, and visually captivating nature of short-form videos encourages an immersive experience, which can lead to compulsive viewing behaviors and even addiction.

2. Mental health issues 

Studies have uncovered a direct correlation between addiction to short-form videos, like those on YouTube Shorts, and depression among adolescents. 

It’s important to emphasize that the videos themselves aren’t inherently the problem; it’s when viewing behavior becomes addictive that mental health problems emerge. 

Short-form video addiction is also linked with social anxiety in adolescents. 

3. Lowered attention span

Numerous studies show that short-form video platforms are associated with greater inattentive symptoms in children. 

Researchers suggest the frequent attention-switching that happens while watching these videos may decrease kids’ ability to focus on a singular task for prolonged periods. 

4. Sleep problems 

A recent study found that teens who exhibit more severe symptoms of short-form video addiction were also more likely to report poorer sleep quality.

5. Dangerous feedback loops

In order to encourage continued engagement, YouTube’s algorithms frequently recommend videos similar to what users have already consumed. This creates a potentially dangerous feedback loop where viewers are primarily fed content that reinforces the same beliefs and opinions. 

These videos also encourage passive viewing rather than critical thinking and seeking out new information. This lack of exposure to different points of view can be particularly harmful to children and teens, who are still forming their worldview and sense of self. 

6. Reduced motivation in school 

Short-form video addiction can decrease students’ motivation to learn and diminish the satisfaction and joy they get from the learning process.

How to keep your child safe on YouTube Shorts

Despite the risks, I don’t plan on banning my child from using YouTube. But I will take additional steps to keep him safe on the platform. Here are some ideas you can try as well: 

1. Parental controls on YouTube Shorts

Google recently rolled out additional parental controls that allow you to limit the amount of time your child spends scrolling through YouTube Shorts or to block short videos altogether. 

These new controls, when used in combination with other YouTube parental controls, go a long way toward helping your child engage with the platform in a healthier manner. 

2. YouTube Kids (for younger children)

YouTube Kids doesn’t have Shorts, so keeping your child on this platform is a great option. YouTube Kids is designed for users up to age 12.

3. Restrict viewing to shared spaces

Requiring your child to watch YouTube in shared spaces, like the living room, makes it easier for you to keep an eye on what they view. 

4. Watch with them

Occasionally sit with your kid and watch YouTube with them to see what they’re interested in and what the algorithm is feeding them. 

5. Use a monitoring app

Even the most vigilant parent can’t catch it all. That’s why BrightCanary’s YouTube monitoring includes YouTube Shorts. The app reports on what your child watches and searches for on YouTube so you don’t have to vet every video yourself. Here’s how: 

  • Advanced technology automatically scans your child’s YouTube activity and sends you real-time alerts when they watch or search for something concerning, so you’ll know when you need to step in.
  • You can choose to review all of their YouTube activity or just videos flagged as concerning. 

Bottom line: Is YouTube Shorts safe for kids? 

YouTube Shorts can be safe for kids, provided parents take proper precautions. Using parental controls (including limiting how long your child can spend scrolling Shorts) and monitoring their use are two vital safeguards if you plan to let them use the app.

BrightCanary helps you monitor your child’s activity on the apps they use the most, including YouTube Shorts. Download today to get started for free.


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • New research finds TikTok's mental health content is riddled with misinformation, and ADHD is the worst offender.
  • OpenAI plans to introduce adult content to ChatGPT, but the age-verification system is already misclassifying minors.
  • Why harmful content keeps reaching kids online — and what advertising has to do with it.

Digital parenting

🧠 More than half of TikTok’s ADHD content is misinformation: Online platforms are flooded with misleading or unsubstantiated mental health content, according to new research. On TikTok alone, 52% of ADHD-related videos and 41% of autism videos were found to be inaccurate. YouTube averaged 22% misinformation on the same topics. 

Content created by healthcare professionals was consistently more accurate, but professional voices represent only a small fraction of what's actually circulating on these platforms. (And that one influencer with the flashy editing and jump-cuts is way more engaging.) The content that spreads is the content that generates engagement, and emotionally resonant self-diagnosis videos do exactly that.

When teens absorb inaccurate information about mental health — especially about their own potential diagnoses — it can shape how they understand themselves, how they talk to doctors, and whether they seek the right kind of help. It can also normalize self-labeling in ways that feel affirming in the short term but complicate actual support down the road.

What parents can do: If your child brings home a TikTok-informed self-diagnosis, resist the urge to dismiss it outright. Instead, treat it as an opening: "That's interesting — what made you feel like that applies to you?" If the concern feels real, bring it to a professional rather than letting the algorithm be the final word.

🔞 OpenAI plans to introduce adult content to ChatGPT, but age-verification is already failing: OpenAI CEO Sam Altman announced that ChatGPT will begin allowing erotica for verified adults, with a rollout expected later this year. We’re not here to yuck anyone’s yum, but the concern — voiced loudly by, among others, billionaire Mark Cuban — is that the age-verification system isn’t there yet, and kids will be the ones impacted most.

OpenAI’s age-verification system misclassifies minors as adults 12% of the time, and we’ve found that existing safety features on ChatGPT are a bust. Cuban’s perspective: "This isn't about porn. That's everywhere. Including here [on X]. This is about the connection that can happen and go into who knows what direction with some kid who used their older sibling's log in." (Case in point: Character.ai limited the way teens use its platform following lawsuits, but other explicit AI chatbot platforms like Polybuzz are thriving.)

For parents, the practical takeaway is the same one that applies to every platform that promises age-gating: the gate is not the protection. Your child's understanding of why certain content is harmful, and their ability to come to you when something feels wrong, is. BrightCanary monitors everything your child types across all apps, including ChatGPT — so if something concerning is happening, you'll know about it.

📺 Why harmful content keeps reaching kids — and what advertising has to do with it: There’s an economic reason why platforms keep serving harmful content to kids, according to researchers writing in The Conversation: recommendation algorithms are designed to maximize engagement, not to distinguish between helpful and harmful content. And emotionally charged content (content that provokes fear, anxiety, outrage, or shock) consistently generates more engagement than neutral material.

Because many social platforms are funded by advertising revenue, and advertising revenue depends on attention, the incentive to serve that content never goes away, regardless of what a platform's safety team is doing on the other side of the building. That’s one of the reasons the same issues keep recurring across different platforms and years, and why parental involvement remains essential regardless of what any platform promises. Curious to learn more? We've written about how social media algorithms work and how to talk to your kids about them.


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


Tech talks

Bullying doesn't always look like name-calling. Online, it can be subtler … and harder for kids to name. Use these conversation starters to check in. That last question is the most important one to get an honest answer to. 

  1. "Has anyone ever said something online that made you feel bad, even if it wasn't obviously mean?"
  2. "Have you ever seen someone get ganged up on in a group chat or in a game? What happened?"
  3. "If a friend was being left out or talked about online, would you say something? What would make that hard?"
  4. "Has anything like that ever happened to you?"
  5. "If something like that happened, do you know you could come to me without getting in trouble?"

What’s catching our eye

🔐 Kids aren't learning cybersecurity in school — but parents can fill the gap. Save these five practical ways to teach kids digital security at home, from modeling good habits yourself to teaching them to question what they see.

📋 Pinterest CEO Bill Ready is backing a social media ban for kids under 16. “As both a CEO and a parent, I believe we need to be honest: social media as it exists today is not safe for kids under 16,” Ready wrote on LinkedIn. “We need clearer rules, better tools for parents, and more accountability across the tech ecosystem.”

💔 A 9-year-old in Texas died after attempting a social media challenge she had seen online. JackLynn Blackwell passed away on February 3rd after attempting the blackout challenge, a dangerous trend that has been circulating on social media platforms for years. The CDC has documented at least 80 child deaths connected to this challenge. We don't share this to frighten you — we share it because awareness is a form of protection. Dangerous viral challenges are rarely announced; they spread quietly through feeds and group chats. Knowing what's circulating and having an open line of communication with your child can make a difference.

Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Why are some experts saying that social media bans don’t work?
  • Good news (and bad news) from Instagram.
  • Plus, new data on how much time teens really spend on their phones at school.

Digital parenting

📱 Good news from Instagram (yes, really): Instagram is rolling out a new feature that alerts parents when their teen searches for suicide or self-harm content — including phrases that suggest a teen may be at risk. The alerts will go to parents enrolled in Instagram's parental supervision tools. Instagram says they set the threshold to require multiple searches within a short window, while still erring on the side of caution. While that means some alerts may not reflect a real crisis, this is a meaningful step overall. If your teen is on Instagram, now is a good time to make sure you're enrolled in parental supervision so these alerts actually reach you.

⚠️ Bad news from Instagram (there it is): According to court documents from the ongoing federal lawsuit in California, Meta's own internal survey found nearly 1 in 5 teens aged 13 to 15 reported seeing unwanted nudity or sexual images on Instagram. The same survey found about 8% of that age group had seen someone harm themselves or threaten to do so on the platform.

These are Meta's own numbers. It's a useful gut-check as Instagram rolls out new safety features: progress is real, and so is the distance still to go.

🚫 Should we ban teenagers from social media? Earlier this year, Australia rolled out its first-of-its-kind social media ban for kids under 16. Similar proposals are circulating in the US and UK. But some argue that we shouldn’t ban teens from social media because kids will always find ways around bans, enforcement is difficult, and waiting until a child turns 16 doesn't actually teach them how to navigate the internet safely. It just delays the moment they're dropped in.

Our take: Why not limit access and create better guardrails? Smarter regulation matters. Platforms don't need to give kids access to features engineered for compulsive use: endless scroll, autoplay, algorithmically turbocharged feeds. Age verification should be meaningful, not performative. And content moderation for minors needs real teeth. But regulation alone isn't a parenting strategy. The goal isn't to keep kids off the internet forever. It's to raise kids who can handle it. That requires ongoing conversations, not just app settings or age cutoffs. When you’re ready to start monitoring social media, start here.




Tech talks

Screen time tends to reach new heights as the school year hits its midpoint. Use these conversation starters to check in on how your teen is feeling about their digital habits … without it turning into a lecture:

  1. "Do you ever feel like your phone is a distraction at school? What do you do about it?"
  2. "If TikTok (or YouTube, or whatever they use most) disappeared tomorrow, what would you miss? What wouldn't you miss?"
  3. "Have you ever seen something online that made you feel bad — even if you didn't want to?"
  4. "What do you think about schools banning phones? Do you think it would help you or hurt you?"
  5. “How do you use your phone during homework time?”

What's catching our eye

😔 The deepfake crisis no one is talking about enough: New large-scale research from UNICEF, ECPAT, and INTERPOL found that at least 1.2 million children across 11 countries reported being victims of sexually explicit deepfakes in the past year. This is an urgent and underreported crisis — and it's a reminder that online safety isn't just about screen time.

📊 How much is your teen on their phone at school? More than an hour, on average — and most of that time is social media. A recent analysis of American teens found that adolescents aged 13 to 18 spend more than 8.5 hours daily on screen-based entertainment overall, with over an hour of phone use happening during the school day itself.

🤖 Teens still love TikTok: New Pew Research data puts some numbers to teen platform habits: 68% of teens ages 13–17 use TikTok, with roughly 1 in 5 saying they're on it almost constantly. About 1 in 5 teens also report nearly constant YouTube use, and 64% of teens use AI chatbots (about 3 in 10 do so daily).


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • A landmark social media addiction trial could reshape how Big Tech designs its platforms.
  • How BrightCanary’s Family Viewing feature helps parents (and co-parents) stay aligned.
  • What do you think about the villain in the newest Toy Story movie?

Digital parenting

⚖️ Is this Big Tech’s Big Tobacco moment? A landmark social media addiction trial is happening right now in Los Angeles. The trial centers on a 20-year-old woman who alleges that endless scrolling and other design features worsened her depression and suicidal thoughts. Snap and TikTok settled before the trial; Meta and YouTube are fighting the claims. Some observers are calling this Big Tech’s Big Tobacco moment — a reference to the tobacco litigation in the ‘90s that exposed internal documents, led to warning labels, and reshaped public health policy. 

Meta CEO Mark Zuckerberg and Instagram chief Adam Mosseri have testified so far. Internal documents shown in court suggest Meta knew minors were using its apps below the age minimum, the company prioritized maximizing time spent scrolling, and safety recommendations from experts were sometimes disregarded. Meta disputes the characterization, arguing the documents are cherry-picked and outdated.

What’s striking is that Meta’s own internal research found that parental supervision tools did not meaningfully curb teens’ compulsive use. Even when parents use the tools the platforms provide, behaviors don’t significantly change — a finding that reinforces something we’ve talked about often: screen time limits and parental controls are not set-it-and-forget-it solutions.

They’re tools. Helpful and necessary ones. But tools alone don’t teach judgment, emotional regulation, or resilience. 

The timing of the trial is especially notable. The day after Adam Mosseri testified that heavy social media use may be “problematic” but not clinically addictive, a new longitudinal study published in Nature found that teens who struggled to describe their feelings or avoid unpleasant emotions were more vulnerable to developing social media addiction over time.

What does it all mean? This trial is ongoing. Researchers and lawmakers around the world are increasingly worried about compulsive use. Hundreds of families and school districts are suing major platforms. And more bellwether cases are coming. If juries consistently find that addictive design harmed minors, the financial and regulatory consequences could be enormous.

For parents, this is a reminder that:

  • Social media platforms are engineered to maximize engagement
  • Parental controls don’t automatically solve design problems
  • Ongoing involvement matters more than app settings

We designed BrightCanary to help parents stay involved and curious in their children’s digital lives. Because technology safety is a skill, not a setting.




Tech talks

Believe it or not, we’re about halfway through the academic year. This is a great time to zoom out and reset goals — both academic and personal. These conversation-starters help teens connect their daily habits to their bigger ambitions.

  1. “What’s one goal you want to hit before the school year ends?”
  2. “Is there anything online that’s helping you — or distracting you — from that goal?”
  3. “If you had 30 fewer minutes on your phone each day, what would you use that time for?”
  4. “What’s something you’re proud of this year?”
  5. “What would finishing this school year strong look like to you?”

What’s catching our eye

🧍‍♀️ What is the internet like for a 15-year-old girl? In this evocative essay, an anonymous teen describes being inundated with misogyny online. (Language warning.) It’s a sobering reminder that algorithms don’t just show content — they shape culture.

🧸 The villain of Toy Story 5 is … tablets. Pixar’s most nostalgic franchise is confronting “iPad kid” culture head-on. The new trailer shows Woody, Buzz, and the gang competing with iPads for kids’ attention. Art imitates life, after all. What do you think about the trailer?

👾 Discord is rolling out age verification for users. What does it mean, and why is your teen so upset about it? We explain.


I’ve written a lot about how social media is detrimental to kids’ mental health. But witnessing the effort some teens in my life put into selfies motivated me to explore the impact these platforms have on young people’s self-esteem in particular. Does the pressure to be perfect online hurt the way they feel about themselves? I discovered the answer is a solid (and, frankly, unsurprising) yes. 

How does social media impact teens’ self-esteem?

Heightened attention to physical appearance and wavering self-esteem are normal for teens, due in part to developing bodies and an increased awareness of social comparison. Here’s how social media has supercharged this:

1. Encourages unhealthy comparisons

Social media prompts unhealthy comparisons in users of all ages. But adolescents’ prefrontal cortexes aren’t fully formed, so they process the videos and images they see online in a particularly harmful way, literally changing their still-developing brains.

2. Exposure to unrealistic beauty standards

Teens are bombarded with curated, heavily edited images online. Research suggests that these unrealistic beauty standards can significantly change their perception of attractiveness, including how they rank themselves in comparison. 

3. In search of the perfect post

It’s not just viewing altered images that’s a problem. Using filters and editing tools to maximize their own physical attractiveness can also lead to lower self-image. This is particularly stark among teens of color due to racial biases in social media beauty filters. Often modeled on white people, filters reinforce racist ideals of attractiveness. 

4. Affects kids of all genders

This conversation often focuses on girls, but boys are also harmed. In one study, nearly every boy reported being exposed to content about appearance such as building muscle and having a certain jawline. Research shows that the more time boys spend on social media, the lower their body satisfaction.

5. Narrow ideas of masculinity and gender roles

Another way young boys are impacted is that they’re frequently fed a narrow idea of what it means to be male. Exposure to content insisting they must build muscle and have lots of money to impress girls is associated with anxiety, feelings of isolation, and low self-esteem in boys. 

6. Not just about looks 

While self-esteem around physical appearance takes a particular hit, it’s not the only area that suffers. Constant comparison with others’ social lives and achievements creates feelings of not measuring up.

Signs that social media is ruining your teen’s self-esteem

Here are some signs that may indicate your teen’s self-esteem is suffering due to social media:

  • Mood swings, especially after scrolling 
  • Crippling fear of failure
  • Excessive comparison to others 
  • Preoccupation with “likes”
  • Difficulty accepting compliments 
  • Ignoring or downplaying their achievements 
  • Blaming themselves when things go wrong 
  • Obsessing over making a post 

Tips: How to help your teen’s self-esteem survive social media

Here’s how to help your teen’s self-esteem survive social media:

1. Reset their algorithm 

Social media algorithms are like echo chambers, amplifying the image-focused content teens are exposed to. In fact, two in three boys report being fed content that promotes stereotypes about masculinity without ever seeking it out. Help your teen periodically reset their algorithm.

2. Encourage real-world relationships

Adolescents with strong offline relationships exhibit higher self-esteem. Encourage your teen to hang out with their friends in person.

3. Teach digital literacy

Help your teen understand the interaction between social media and self-image. Give them opportunities to process those feelings and encourage them to pull back or take a break from social media when it makes them feel bad. 

4. Model a healthy relationship with social media 

Adults aren’t immune to the vicious cycle of social media comparison. But seeing you negatively compare yourself to what you see online sets a harmful example for your child. 

This is an instance where we need to fake it till we feel it, folks. Work out your own social media-induced insecurities with a friend or therapist and keep that business away from your impressionable offspring. 

5. Pay attention 

Overall, there’s a societal acceptance of body dissatisfaction in teens (especially girls). This creates a dangerous environment because teens’ feelings of inadequacy over what they see online are easily dismissed as typical.

Monitor what your child does online and how it makes them feel, and don’t dismiss your instincts when you suspect something is wrong. 

How BrightCanary can help

BrightCanary helps you keep an eye on social media’s impact on your teen. 

You get: 

  • Monitoring of everything they type across all apps and social media platforms.
  • Real-time alerts when they show signs of problems, including disordered eating and mental health issues.
  • Summaries of their online activity and access to full transcripts. 
  • Emotional insights informed by the American Psychological Association.
  • 24/7, tailored parenting advice with Ask the Canary

In short

Exposure to heavily edited images, unrealistic beauty standards, and unhealthy portrayals of gender roles on social media negatively impacts teens’ self-esteem. You can help by keeping an eye on your child’s activity online, resetting their algorithm, teaching them digital literacy, and modeling a healthy relationship with social media. 

BrightCanary helps you monitor your child’s activity on social media by monitoring everything they type across all apps. Download today to get started with a free trial.


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • A new report reveals how gambling is quickly becoming the new normal for boys.
  • Experts warn AI in schools may undermine learning and social development.
  • Screen time limits alone aren’t enough anymore, warns the American Academy of Pediatrics.

Digital parenting

🎰 Gambling is becoming alarmingly common among boys: A new report from Common Sense Media is a wake-up call for parents: 36% of boys ages 11–17 gambled in the past year. And we aren’t talking about slots or poker — the report looked at sports betting apps, loot boxes, skin cases, gacha-style rewards inside video games, and social media feeds that normalize betting. Nearly one in four boys have engaged in gaming-related gambling, and most spent real money doing it. Some stats:

  • 12% of boys bet on sports, including fantasy leagues and small peer bets
  • 12% engaged in traditional gambling, with older teens far more likely
  • 6 in 10 boys see gambling ads on YouTube and social media
  • Gambling is highly social: over 80% of boys gamble if their friends do, compared to under 20% if their friends don’t

While many boys describe gambling as “low-stakes” or just part of bonding with friends or family (one-third have gambled with family members), 27% of boys who gamble report negative effects like stress or conflict. The report also highlights a major loophole: while gambling is illegal for minors, in-game gambling mechanics often aren’t regulated the same way, making it easy for kids to spend (or lose) real money.

What parents can do: Start conversations early, recognize that gambling comes in many forms, set clear rules around spending and games, monitor influences (friends, online activity, and games), and watch for warning signs like secrecy or emotional changes.

🤖 The risks of AI in schools may outweigh the benefits: A new study from the Brookings Institution suggests that while AI tools are being rapidly adopted in classrooms, the risks currently outweigh the benefits — especially for kids’ cognitive and social development. Researchers warn of a “doom loop” where students offload thinking to AI, weakening problem-solving and learning skills over time. There are also concerns about kids developing social and emotional habits through chatbots designed to agree with them, making real-world disagreement and collaboration harder.

UNICEF recommends that parents talk to kids early about what AI is, warn against sharing personal information with AI tools, watch for signs of overuse or behavioral changes, and stay involved in how AI is used for school and beyond. Not sure where to start? Check out our free AI safety toolkit for parents (plus a free code for BrightCanary — send it to another parent!).

📵 Why screen time limits alone aren’t enough anymore: The American Academy of Pediatrics says it’s time to rethink how we manage kids’ screen use. New guidance emphasizes that time limits alone don’t address the real issue: digital platforms are intentionally designed to keep kids engaged through autoplay, notifications, and algorithmic feeds.

Screen time doesn’t tell the whole story anymore. Instead of rigid rules, parents are encouraged to focus on how screens are used, what content kids are engaging with, and how digital life affects sleep, learning, and mental health. Think less stopwatch, more strategy. BrightCanary is designed to help parents stay informed about their child’s activity across all the apps they use — so you know not only what apps your kiddo is using, but also what they encounter. Here’s how to start monitoring (without breaking trust).


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


Tech talks

Many kids don’t think they’re gambling … even when they absolutely are. Your goal is to help kids recognize risks before habits form. These conversation-starters can help you open the door without judgment:

  1. “Have you ever spent money in a game for a chance to win something random?”
  2. “What kinds of bets do kids your age joke about or make with friends?”
  3. “Do you see sports betting or gambling content on YouTube or social media?”
  4. “How do games make it feel exciting to spend money — and how do they make money back?”
  5. “What would you do if a game or app started making you feel stressed or pressured?”

What’s catching our eye

📱 TikTok gets an American makeover: TikTok officially has US-based owners. So, the app isn’t going anywhere — but the experience won’t stay the same. Experts say changes will likely show up first in moderation and data practices, not features. If your child uses TikTok, use the Family Pairing feature to set guardrails around their use.

🧹 YouTube takes down major AI slop channels: Following a report showing massive growth in low-quality AI-generated content, YouTube appears to have removed several top “AI slop” channels with millions of subscribers.

🪪 Discord rolls out global age verification: Starting next month, Discord will require face scans or ID for full access. Accounts default to a teen-safe experience unless verified as adult — with stricter filters and protections baked in.


According to a survey by the Pew Research Center, 64% of teens report using generative AI chatbots like ChatGPT for everything from homework help to companionship. But a startling concern is emerging among experts. Early research suggests that overreliance on generative AI could lead to cognitive atrophy and the loss of brain plasticity. Or, as the kids say: brain rot.

As a parent who is determined to teach my kids how to use AI responsibly, I’ve been watching this issue closely. Here’s what to know about how overusing AI impacts the brain and how to protect your child’s cognitive abilities in the face of this new technology.    

What are the cognitive dangers to kids from overreliance on AI?

Generative AI is in its infancy, and so is the research on this topic. But cognitive offloading is likely to blame for AI’s impact on kids’ cognitive health. 

Cognitive offloading happens when people use external tools or resources to reduce mental effort. On the one hand, this process can help people accomplish tasks faster. On the other hand, all of that offloading can be harmful for developing brains.

1. Using AI hinders skills such as writing and reasoning

Experts suggest cognitive offloading erodes critical thinking and reasoning skills.

When AI always provides the answers, kids miss out on the opportunity to develop foundational life skills like problem-solving and deep thinking. 

For example, learning to write is deeply intertwined with learning to think. However, offloading writing tasks degrades students’ ability to organize and express their thoughts.

2. Overuse of AI weakens the way the brain absorbs information

When kids offload tasks to AI without doing any legwork, their ability to perform independent research and analyze materials decreases. Students end up with only a superficial understanding of information — they can state the what, but don’t grasp the why or how.

3. Children and teens are the most vulnerable

Research has shown that younger users demonstrate a higher dependence on AI tools when compared to older users, and that the corresponding decline in their critical thinking is also greater.

The brain is particularly malleable during childhood and adolescence, making kids and teens especially vulnerable to the impacts of AI.

Because younger children are more likely to anthropomorphize, or assign human properties to inanimate objects, experts suggest that even simple praise from an AI chatbot can greatly change their behavior.  

How can I help my child use AI in a healthy way? 

The sooner you start teaching your child to use AI smartly, the more you can buffer its effect on their brain. 

1. Help your child build AI literacy 

To help your child gain AI literacy, teach them:

  • How AI tools work. Here’s a great primer for kids.
  • AI can be wrong. From hallucinations to faulty data to fraud, AI doesn’t always get the facts straight. 
  • AI contains bias. AI is trained on data from humans, and humans are inherently biased. Therefore, so is AI. 
  • How overuse of AI can impact their brain. Ask open-ended questions like, "AI can give quick answers, but what do you think happens to our brains when we don't have to work hard to solve things?"

2. Teach your child to use AI as a tool, not a crutch

AI isn’t inherently harmful. The key is using it to support thinking, not replace it. Encourage your child to:

  • Generate their own ideas. 
  • Limit their use of AI, and explain that moderation is key.
  • Use AI as a starting point for research, but independently verify facts.
  • Write first drafts themselves to gain the cognitive benefits of organizing and expressing their thoughts.
  • Focus on using AI to improve productivity rather than offloading thinking.
  • Think critically about the material produced by AI.

3. Model a balanced approach to AI

  • Examine your own use (or overuse) of AI.
  • Openly question the information you get from AI.
  • Maintain a curiosity mindset and let your kids see you engaging in activities and pursuits without the use of AI.

How BrightCanary helps you monitor your child’s AI use

BrightCanary helps you monitor how your child engages with AI by scanning everything they type on their iPhone or iPad. Use it to: 

  • Monitor their activity across every app. 
  • Access summaries of their activity.
  • Read full transcripts when you need more details.
  • Get real-time alerts if your child types anything concerning on an AI platform (or any other app). 

In short 

Overreliance on generative AI may lead to a decline in cognitive skills such as critical thinking, reasoning, and the ability to analyze and understand information. Because their brains are especially malleable, children and teens are particularly vulnerable to the impacts of AI on the brain. It’s important to teach your child AI literacy, show them how to use the tool responsibly, and monitor how they use it. 

BrightCanary helps you monitor your child’s activity on the apps they use the most, including all AI platforms. Download today to get started for free.


It will come as no surprise to parents that YouTube is all the rage with kids. In fact, recent research suggests that nine out of 10 kids use YouTube, and kids under 12 favor YouTube over TikTok. With all of YouTube’s popularity, how can you make the platform safer for your child? Read on to learn how to set parental controls on YouTube. 

Why parental controls matter

As the name implies, YouTube is a platform for user-generated content. While this creates an environment ripe for creativity, it also means there’s a little bit of everything, including videos featuring violent and sexual content, profanity, and hate speech. 

Because YouTube makes it easy for kids to watch multiple videos in a row, there’s always the chance your child may accidentally land on inappropriate content. In addition, the comments sections on YouTube videos are often unmoderated and can be full of toxic messages and cyberbullying. 

Due to the risks, it’s important that parents monitor their child’s YouTube usage, discuss the risks with them, and use parental controls to minimize the chance they’re exposed to harmful content. 

How to set parental controls on YouTube

YouTube offers a variety of options for families looking to make their child’s viewing experience as safe as possible. Here are some important steps parents can take: 

Create a supervised Google account for YouTube

A supervised account will allow you to manage your child’s YouTube experience on the app, website, smart TVs, and gaming consoles. 

Select a content setting

There are three content setting options to choose from: 

  • Explore: Content rated for viewers 9+. This category also excludes live streams, with the exception of Premieres.
  • Explore more: For viewers 13+. This setting includes a larger set of videos, including live streams. 
  • Most of YouTube: For viewers 13–17. This option includes almost everything on YouTube, but excludes content marked as 18+ by channels or by YouTube’s systems and reviewers. 

Set parental controls

Along with content settings, here are some additional YouTube parental controls to explore: 

  • Block specific channels: When monitoring your child's YouTube usage, if you encounter content you prefer they avoid, you have the option to block that channel. 
  • Review your child’s watch history: When you can't supervise their viewing at the moment, you can check what your child has been watching.  
  • Control video suggestions: If you don’t like the videos YouTube’s algorithm is suggesting for your child, try these steps to reset their YouTube algorithm:
    • Clear history
    • Pause watch history 
    • Pause search history
  • Disable Autoplay: This setting prevents YouTube from automatically playing the next suggested video.
  • Set time limits: If you need a little help enforcing screen time limits, this option shuts down the app when your child reaches their max. 

Parents will also soon be able to set specific restrictions on YouTube Shorts, the platform’s short-form video experience similar to TikTok, including time limits on Shorts and custom reminders for bedtime and screen time breaks. As of this writing, these features aren’t yet available.

For step-by-step instructions for setting up parental controls, refer to this comprehensive guide by YouTube. 

Where parental controls on YouTube fall short

While YouTube offers an impressive array of parental control settings, you still have to manually review your child’s content and watch history to catch any concerning content. 

BrightCanary is a parental monitoring app that fills in the gaps. Here’s how BrightCanary helps you supervise your child’s YouTube activity:

  • The app reports on what your child is watching and searching for, so you don’t have to watch each video on your own.
  • Advanced technology automatically scans your child’s video activity and flags anything concerning, so you’ll know when you need to step in.
  • You can either view all of their YouTube activity, or just review any videos flagged as concerning.
  • You can also monitor Google activity, texts, social media, and more — more coverage than other parental control apps on Apple devices.

YouTube vs. YouTube Kids

For parents looking for additional peace of mind, YouTube Kids provides curated content designed for children from preschool through age 12. 

For households with multiple children, parents can set up an individual profile for each child, so kids can log in and watch videos geared toward their age. YouTube Kids also allows parents to set a timer of up to one hour, limiting how long a child can use the app. 

Parents should be aware that switching to YouTube Kids isn’t a perfect solution. There’s still a chance that inappropriate content may slip through the filters. 

In fact, a study by Common Sense Media found that 27% of videos watched by kids 8 and under are intended for older audiences. And for families concerned about ads, YouTube Kids still has plenty of those — targeted specifically toward younger children. Keeping an eye on what your child is watching and talking to them about inappropriate videos and sponsored content is still a good idea, even with YouTube Kids. Fortunately, you can also monitor YouTube Kids with BrightCanary.

It’s also worth noting that kids under 12 who have a special interest they want to pursue may find YouTube Kids limiting. A child looking to watch Minecraft instructional videos or do a deep dive into space exploration, for example, can find a lot more options on standard YouTube — plenty of which are perfectly appropriate for kids, even if they aren’t specifically geared toward them. It’s cases like this where parental controls and active monitoring are especially useful. 

The takeaway

YouTube is a popular video platform with plenty to offer kids. It’s not without risks, though. Parents should monitor their child’s use and take advantage of parental controls to ensure a safe, appropriate viewing experience. 

©2024 Tacita, Inc. All Rights Reserved.