How to Reset Your Child’s Social Media Algorithm

By Andrea Nelson
October 19, 2023

As a parent, you want your child to surround themselves with good influences. That’s true not only for who they spend time with in real life, but also for the people and ideas they’re exposed to on social media. 

If you or your child are concerned about the content appearing in their feed, one beneficial step you can take is to help them reset their social media algorithm. Here’s how to reset your child’s algorithm on TikTok, Instagram, and other platforms.

What is a social media algorithm?

Social media algorithms are the complex computations that operate behind the scenes of every social media platform to determine what each user sees. 

Everything on your child’s social media feed is likely the result of something they liked, commented on, or shared. (For a more comprehensive explanation, check out our Parent’s Guide to Social Media Algorithms.)

Social media algorithms have a snowball effect. For example, if your child “likes” a cute dog video, they’ll likely see more of that type of content. However, if they search for topics like violence, adult material, or conspiracy theories, their feed can quickly be overwhelmed with negative content.

Therefore, it’s vital that parents actively examine and reset their child’s algorithm when needed, and also teach them the skills to evaluate it for themselves. 

Research clearly demonstrates the potentially negative impacts of social media on tweens and teens. How it affects your child depends a lot on what’s in their feed. And what’s in their feed has everything to do with algorithms. 

Talking to your child about their algorithm

Helping your child reset their algorithm is a wonderful opportunity to teach them digital literacy. Explain why it’s important to think critically about what they see on social media, and how what they do on the site influences the content they’re shown. 

Here are some steps you can take together to clean up their feed: 

Start with their favorite app

Resetting all of your child’s algorithms in one fell swoop can be daunting. Instead, pick the app they use the most and tackle that first. 

Scroll through with them

If your kiddo follows a lot of accounts, you might need to break this step into multiple sessions. Pause on each account they follow and have them consider these questions:

  • Do this person’s posts usually make me feel unhappy or bad about myself? 
  • Does this account make me feel like I need to change who I am? 
  • Do I compare my life, body, or success with others when I view this account? 

If the answer is “yes” to any of these questions, suggest they unfollow the account. If they’re hesitant — for example, if they’re worried unfollowing might cause friend problems — they can instead “hide” or “mute” the account so they don’t see those posts in their feed. 

Encourage interaction with positive accounts 

On the flip side, encourage your child to interact with accounts that make them feel good about themselves and portray positive messages. Liking, commenting, and sharing content that lifts them up will have a ripple effect on the rest of their feed. 

Dig into the settings 

After you’ve gone through their feed, show your child how to examine their settings. These mostly influence sponsored content, but considering the problematic history of advertisers marketing to children on social media, it’s wise to take a look. 

Every social media app has slightly different options for how much control users have over their algorithm. Here's what you should know about resetting the algorithm on popular apps your child might use.

How to reset Instagram algorithm

  • Go to Settings > Ads > Ad topics. You can view a list of all the categories advertisers can use to reach your child. Tap “See less” for ads you don’t want to see. 
  • Go to your child’s profile > tap Following > scroll through the categories to view (and unfollow) the accounts that appear most in your child’s feed.
  • Tap the Explore tab in the bottom navigation bar and encourage your child to search for new content that matches their interests, like cooking, animals, or TV shows.

How to reset TikTok algorithm

  • Go to Settings > Content Preferences > Refresh your For You feed. This is like a factory reset of your child’s TikTok algorithm.
  • Go to Settings > Free up space. Select “Clear” next to Cache. This will remove any saved data that could influence your child’s feed.
  • As your child uses TikTok, point out the “Not Interested” feature. Tap and hold a video to pull up this button. Tapping “Not interested” tells TikTok’s algorithm not to show your child videos they don’t like. 

How to reset YouTube algorithm

  • Go to Library > View All. Scroll back through everything your child has watched. You can manually remove any videos that your child doesn’t want associated with their algorithm: tap the three dots on the right side of a video, then select Remove from watch history.
  • Go to Settings > History & Privacy. Tap “Clear watch history” for a full reset of your child’s YouTube algorithm.

What to watch for

To get the best buy-in and help your child form positive long-term content consumption habits, it’s best to let them take the lead in deciding what accounts and content they want to see. 

At the same time, kids shouldn't have to navigate the internet on their own. Social platforms can easily suggest content and profiles that your child isn't ready to see. A social media monitoring app, such as BrightCanary, can alert you if your child encounters something concerning.

As you review your child’s feed, keep an eye out for warning signs. If you spot concerning content, it’s time for a longer conversation to assess your child’s safety. You may decide it’s appropriate to insist they unfollow a particular account. And if what you see on your child’s feed makes you concerned for their mental health or worried they may harm themselves or others, consider reaching out to a professional.  

In short 

Algorithms are the force that drives everything your child sees on social media and can quickly cause their feed to be overtaken by negative content. Regularly reviewing your child’s feed with them and teaching them skills to control their algorithm will help keep their feed positive and minimize some of the negative impacts of social media. 


Just by existing as a person in 2023, you’ve probably heard of social media algorithms. But what are algorithms? How do social media algorithms work? And why should parents care? 

At BrightCanary, we’re all about giving parents the tools and information they need to take a proactive role in their children’s digital life. So, we’ve created this guide to help you understand what social media algorithms are, how they impact your child, and what you can do about it. 

What is a social media algorithm? 

Social media algorithms are complex sets of rules and calculations used by platforms to prioritize the content that users see in their feeds. Each social network uses different algorithms. The algorithm on TikTok is different from the one on YouTube. 

In short, algorithms dictate what you see when you use social media and in what order. 

Why do social media sites use algorithms?

Back in the Wild Wild West days of social media, you would see all of the posts from everyone you were friends with or followed, presented in chronological order. 

But as more users flocked to social media and the amount of content ballooned, platforms started introducing algorithms to filter through the piles of content and deliver relevant and interesting content to keep their users engaged. The goal is to get users hooked and keep them coming back for more.  

Algorithms are also hugely beneficial for generating advertising revenue for platforms because they help target sponsored content. 

How do algorithms work? 

Each platform uses its own mix of factors, but here are some examples of what influences social media algorithms:

Friends/who you follow 

Most social media sites heavily prioritize showing users content from people they’re connected with on the platform. 

TikTok is unique because it emphasizes showing users new content based on their interests, which means you typically won’t see posts from people you follow on your TikTok feed. 

Your activity on the site

With the exception of TikTok, if you interact frequently with a particular user, you’re more likely to see their content in your feed. 

The algorithms on TikTok, Instagram Reels, and Instagram Explore prioritize showing you new content based on the type of posts and videos you engage with. For example, the more cute cat videos you watch, the more cute cat videos you’ll be shown. 

YouTube looks at the creators you interact with, your watch history, and the type of content you view to determine suggested videos. 

The popularity of a post or video 

The more likes, shares, and comments a post gets, the more likely it is to be shown to other users. This momentum is the snowball effect that causes posts to go viral. 

Why should parents care about algorithms? 

There are ways social media algorithms can benefit your child, such as creating a personalized experience and helping them discover new things related to their interests. But the drawbacks are also notable — and potentially concerning. 

Since social media algorithms show users more of what they seem to like, your child's feed might quickly become overwhelmed with negative content. Clicking a post out of curiosity or naivety, such as one promoting a conspiracy theory, can inadvertently expose your child to more such content. What may begin as innocent exploration could gradually influence their beliefs.

Experts frequently cite “thinspo” (short for “thinspiration”), content that promotes unhealthy body ideals and disordered eating habits, as another algorithmic concern.

Even though most platforms ban content encouraging eating disorders, users often bypass these filters with creative hashtags and abbreviations. If your child clicks on a thinspo post, they may continue to be served content that promotes eating disorders.

Social media algorithm tips for parents

Although social media algorithms are something to monitor, the good news is that parents can help minimize the negative impacts on their child. 

Here are some tips:

Keep watch

It’s a good idea to monitor what the algorithm is showing your child so you can spot any concerning trends. Regularly sit down with them to look at their feed together. 

You can also use a parental monitoring service to alert you if your child consumes alarming content. BrightCanary is an app that continuously monitors your child’s social media activity and flags any concerning content, such as photos that promote self-harm or violent videos — so you can step in and talk about it.

Stay in the know

Keep up on concerning social media trends, such as popular conspiracy theories and internet challenges, so you can spot warning signs in your child’s feed. 

Communication is key

Talk to your child about who they follow and how those accounts make them feel. Encourage them to think critically about the content they consume and to disengage if something makes them feel bad. 

In short

Algorithms influence what content your child sees when they use social media. Parents need to be aware of the potentially harmful impacts this can have on their child and take an active role in combating the negative effects. 

Stay in the know about the latest digital parenting news and trends by subscribing to our weekly newsletter.


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • 55% of students use social media to self-diagnose mental health conditions. We break down what this means and how parents can talk to their kids about what’s on their feed. 
  • As of Jan. 1, 2025, kids under 14 are banned from having social media accounts in Florida — assuming the bill isn't held up in court.
  • This week in Tech Talks: conversation-starters to check in with your child about mental health, checking their sources, and more.

Digital Parenting

Kids are using social media to self-diagnose

If your teen suddenly has a new lexicon of mental health terms, like “trauma response” and “major depressive disorder,” TikTok may be to blame. A poll by EdWeek found that 55% of students use social media to self-diagnose mental health conditions, and 65% of teachers say they’ve seen the phenomenon in their classrooms. 

“Kids are all coming in and I’m asking them, ‘Where did you get this diagnosis?’” said Don Grant, national adviser for healthy device management at Newport Healthcare, in an interview with The Hill. Grant said he would get responses such as “Oh, there’s an [influencer],” “Oh, I took a quiz,” or “Oh, there’s a group on social media that talks about it.”  

Social media can help kids understand their feelings and find ways to cope. The EdWeek poll found that 72% of educators believe social media has made it easier for students to be more open about their mental health struggles. And it makes sense that kids would turn to a space they know — social media and online groups — to get information, rather than finding a mental health professional first (or talking to their parents). 

However, the topic gets tricky when you consider the fact that social media sites don’t exactly verify that the people sharing medical advice are, in fact, medical experts. While there are plenty of experts sharing legitimate information online, there are also influencers who are paid to talk about products that improved their anxiety and off-label medications that cured their depression. 

Big picture: Self-diagnosing on social media is also problematic because algorithms can create a self-fulfilling prophecy. Most algorithms, like TikTok’s, use a user’s activity to determine what they see next on their feed. If a teen thinks they have depression, they’ll see more content about depression — which may confirm their self-diagnosis, even if they aren’t clinically depressed.

As parents, it’s important to talk to your child about mental health, how to cope with big emotions, and what to do if they need a professional. But it’s also essential to know where they’re getting their mental health information and what they’re seeing on their social media feeds. 

Don’t dismiss their feelings outright — be curious. Talk to your child about verifying their sources of information. If they’re getting medical advice from an online creator, are they an actual doctor or therapist? Or are they simply someone who’s popular online?

Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.

Florida passes one of the most restrictive social media bans for minors

Gov. Ron DeSantis recently signed a bill that bans kids under 14 from creating social media accounts and requires parental consent for kids under 16. The bill requires that companies delete accounts belonging to 14- and 15-year-olds and implement age verification measures to ensure that kids aren’t lying about their ages. 

Florida’s bill is the most restrictive social media ban in the nation, and that’s after DeSantis vetoed an earlier version of the bill that would have banned all accounts for kids under 16. At the bill-signing ceremony, Republican Speaker Paul Renner said, “A child in their brain development doesn’t have the ability to know that they’re being sucked into these addictive technologies and to see the harm and step away from it, and because of that we have to step in for them.”

Legal upheaval: The bill takes effect Jan. 1, 2025, pending any legal challenges. Tech industry groups have already come out against the bill, including NetChoice, an association that represents major social media platforms and is currently battling with the Supreme Court over a separate social media law. 

“This bill goes too far in taking away parents’ rights,” Democratic Rep. Anna Eskamani said in a news release. “Instead of banning social media access, it would be better to ensure improved parental oversight tools, improved access to data to stop bad actors, alongside major investments in Florida’s mental health systems and programs.”

In our last issue, we covered Utah’s decision to repeal and replace its social media law after months of legal challenges that delayed the bill’s implementation. Although DeSantis and Renner have signaled that they’re ready to fight to keep Florida’s social media ban in place, time will tell whether or not Florida’s kids will have to wait until their sweet 16 to get on Snapchat. 

Tech Talks With Your Child

How will you check in with your child about online safety this week? Save these conversation-starters for your next check-in. 

  1. "Have you ever come across anything online that made you feel uncomfortable or worried?”
  2. "Do you know how to check if information you find online is true or reliable? Let's talk about how to evaluate sources together."
  3. "How do you feel after spending time on social media? Does it ever affect your mood or feelings about yourself?"
  4. "What would you do if you received a message or saw a post that talked about depression or anxiety? Do you know who to talk to?"
  5. “What are some ways you like to spend time with your friends offline? Can we plan any upcoming events or get-togethers?”

Practical Parenting Tips

How does screen time affect sleep?

Sleep can impact everything from brain performance, to mood, to mental and physical health. Many children aren’t getting enough sleep, and screens are one of the prime suspects. But how does screen time affect sleep?

A parent’s guide to Pinterest parental controls

Pinterest use is up among teens. Gen Zers are using the website as a canvas for self-expression and exploration. Learn more about how to keep your child safe on the site with Pinterest parental controls.  

What’s Catching Our Eye

😮‍💨 What is the “mental load” of parenting, and how does it affect your emotions, sleep quality, and job performance?

🚩 What are the red flags that you need to worry about your child’s mental health? Save this list from Techno Sapiens.

🤝 Rules and restrictions aren’t the end-all, be-all to parenting in the digital age — you also need a healthy, emotionally rich relationship with your teen. Read more at Psychology Today.

📵 When it comes to protecting kids’ mental health, Florida’s social media ban won’t be that simple, writes David French for the New York Times.


In a surprising resurgence of the platform’s cool factor, Pinterest use is up among teens. Gen Zers are using the website as a canvas for self-expression and exploration. Read on to learn more about how to keep your child safe on the site with Pinterest parental controls.  

What is Pinterest? 

Pinterest describes itself as “a visual discovery engine for finding ideas.” Users save “Pins” of images or videos to virtual boards. They can record live videos and take photos right in the app, or save images found elsewhere on the internet as Pins. 

How your child might use Pinterest

Many kids come to Pinterest to find inspiration and share ideas around a hobby or interest. Teens are more likely than their adult counterparts to create Pins of things they’ve made and their outfits. Kids also use it to connect with others around common interests, such as books, beauty, or fashion.

How your child might interact with others on Pinterest

Pinterest allows users to interact with each other through comments, direct messages, and shared boards. Although Pinterest may seem relatively tame in comparison to TikTok or Snapchat, parents should take the same precautions as they do with other social media sites.   

Here are some ways people might interact with your child on Pinterest: 

  • Group boards: Boards can be secret or public. Secret boards become group boards when users are invited as collaborators. 
  • Followers: Anyone can follow a public account. Accounts set to private aren’t discoverable, but users can invite people to follow their private account. 
  • Reactions and comments: Users can react to and comment on Pins. 
  • Direct message: Users can exchange private messages with one another. 
  • Mentions: People can use the @ symbol to mention other users in Pin comments and descriptions, which notifies the person mentioned.
  • Sharing: Pins can be shared on other social media networks, sent to users on Pinterest, or shared via email with people not on Pinterest.

Risks of letting your child use Pinterest 

Just like any social media site, there are risks parents need to be aware of. In 2023, NBC News reported that adult men were using Pinterest to create boards with pictures of young girls and teenagers. The platform responded by rolling out a suite of new Pinterest parental controls, which we’ll discuss below.

Aside from online predators, Pinterest can also expose your child to content that promotes negative body image, negative self-esteem, and even suicidal thoughts. Like other websites, Pinterest uses an algorithm to recommend content based on what your child searches and the pins they click. Research shows that excessive social media use can make kids feel bad about themselves, so it’s important to talk to your child about the content on their feed and limit the time they spend on social media — including Pinterest.

Exposure to inappropriate content is also a risk on Pinterest. Pins can lead kids to websites with explicit content, misinformation, and just plain spam, solely because they clicked a pin that caught their attention. 

Benefits of letting your child use Pinterest 

There are also plenty of positive reasons to let your child use Pinterest, with guardrails. 

For example, Pinterest can be a great source of inspiration, creative expression, and connection because users have the ability to dive deeper into their interests. Plus, Pinterest is full of tutorials that can help kids learn new skills, like cooking and coding. 

Pinterest can even foster a boost of positivity. Recent research from Pinterest and the University of California, Berkeley, found that daily interaction with inspiring content on Pinterest helped buffer students against burnout and stress. 

How to use Pinterest parental controls

The good news is that Pinterest parental controls are fairly robust. The company recently took steps to protect minors on their site, including age verification, automatically setting accounts to private for users under 16, and additional reporting options. The minimum age for Pinterest users is 13.

There are also extra steps you can take to keep your child safe on Pinterest: 

  • Verify their age: Confirm they entered their age correctly when they signed up for an account to ensure the teen safety settings are in place. 
  • Monitor their account: Follow your child on Pinterest and sit down with them periodically to view their feed together.  
  • Set up a parental passcode: This code locks certain privacy, data, and social permissions settings. 
  • Help them set their privacy: Check that their account is set to private and show them how to adjust their settings to control who can view their content. They can also edit their profile to control what information is displayed.  
  • Encourage them to use secret boards: Secret boards can only be viewed by your child and people they invite. They should only invite people they know in real life.
  • Talk with them about safety: Encourage your child to only share content with people they know and trust. Discuss the risks of allowing people they don’t know access to their boards and remind them to be cautious about what they share. 
  • Establish open communication: Be upfront about the risks your child may face on Pinterest. Make it clear they can come to you if they have a problem, and you’ll help them through it. 
  • Report and block: Show your child how to report inappropriate Pins, users, and messages. Make sure they also know they can block users who make them uncomfortable.

The bottom line

While Pinterest can be a positive creative outlet for kids, it’s not without risk. Parents should educate themselves about the potential dangers and take steps to keep their child safe on the site. 


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Are the days of lip-syncing to trending songs coming to an end? We break down the proposed TikTok ban heading to a Senate vote.
  • Speaking of social media, have you changed your child’s privacy settings on Instagram? We share tips to make your child’s Instagram account safer.
  • 44% of teens say they feel anxious without their smartphones, according to a new Pew Research Center survey.

Digital Parenting

Inside the proposed TikTok ban

Today, the House overwhelmingly voted to pass a bill that would effectively ban TikTok in the United States. The bill now heads to the Senate, where its future is less certain. The measure, H.R. 7521, would ban applications controlled by foreign adversaries of the United States that pose a clear national security risk. 

For years, US officials have dubbed TikTok a national security threat. China’s intelligence laws could enable Beijing to snoop on the user information TikTok collects. Although the US government has not publicly presented evidence that the Chinese government has accessed TikTok user data, the House vote was preceded by a classified briefing on national security concerns about TikTok's Chinese ownership.

If H.R. 7521 is passed, ByteDance will have 165 days to sell TikTok. Failure to do so would make it illegal for TikTok to be available for download in U.S. app stores. On the day of the vote, TikTok responded with a full-screen pop-up that prompted users to dial their members of Congress and express their opposition to the bill. In a post on X, TikTok shared: “This will damage millions of businesses, deny artists an audience, and destroy the livelihoods of countless creators across the country.”

"It is not a ban,” said Representative Mike Gallagher, the Republican chairman of the House select China committee. “Think of this as a surgery designed to remove the tumor and thereby save the patient in the process."

The bottom line: The bill passed the House Energy and Commerce Committee unanimously, which means legislators from both parties supported the bill. Reuters calls this the “most significant momentum for a U.S. crackdown on TikTok … since then-President Donald Trump unsuccessfully tried to ban the app in 2020.” The TikTok legislation's fate is less certain in the Senate. If the bill clears Congress, though, President Biden has already indicated that he would sign it.

If your child uses TikTok, it’s natural that they may have questions about the ban (especially if they dream of becoming a TikTok influencer). Nothing is set in stone, and it’s entirely possible that TikTok would simply change ownership. However, this is a good opportunity to chat with your kids about the following talking points:

  • It’s true that social media can be entertaining and educational. 
  • But social media companies can buy and sell your data, use algorithms to change your opinions about topics, and design their apps to make you spend more time using them.
  • We elect representatives to represent us. That’s why it’s important to vote, stay informed about current events, and think critically about the information you consume.

Practical Parenting Tips

Is Instagram safe for kids? A parent’s guide to safety recommendations

Set your child’s account to private, limit who can message them, and limit reposts and mentions. With a few simple steps, you can make Instagram a safer place for your kid. Here’s how to get it done.

How to talk to your child about sending inappropriate text messages

Yikes — you found out that your child has been sending concerning videos, images, or messages to someone else. We break down some of the reasons kids send inappropriate messages and how to approach them.

What’s Catching Our Eye

🏛️ An update on Florida’s social media ban: as expected, Governor Ron DeSantis vetoed a bill that would have banned minors from using social media, but signaled that he would sign a different version anticipated from the Florida legislature.

📵 Nearly three-quarters (72%) of U.S. teens say they feel happy or peaceful when they don’t have their smartphones — but 44% say they feel anxious without them, according to Pew Research Center.

📖 Do digital books count as screen time? The benefits of reading outweigh screen time exposure, according to experts.

🗺️ How can parents navigate the challenges of technology and social media? Set limits, help your child realize how much time they spend on tech, and model self-restraint. Check out these tips and more via Psychology Today.


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Florida may ban social media for minors. The bill, HB1, is currently awaiting Governor DeSantis’ veto or approval.
  • The Kids Online Safety Act just hit a big milestone: it officially has enough supporters to pass the Senate, although the bill hasn’t yet moved to a vote.
  • Meta announced the expansion of a program to help teens avoid sextortion scams on Facebook and Instagram.

Digital Parenting

New milestone for online safety legislation

A Florida bill that bans minors from using social media recently passed the House and Senate. The bill, HB1, is now on Governor Ron DeSantis’ desk. He’ll have until March 1 to veto the legislation or sign it into law. 

DeSantis has previously said that he didn’t support the bill in its current form, which bars anyone younger than 16 from creating new social media accounts and closes existing accounts for kids under 16. (DeSantis has called social media a “net negative” for young people, but said that, with parental supervision, it could have beneficial effects.) Unlike online safety bills passed in other states, HB1 doesn’t allow minors to use social media with parental permission: if you’re a minor, you can’t have an Instagram account.

Even if DeSantis vetoes the bill, the fact that such an aggressive bill passed both the House and Senate with bipartisan support signals that the conversation about online safety legislation is reaching a tipping point. 

The Kids Online Safety Act (KOSA), which implements social media regulations at the federal level, also recently reached a major milestone: an amended version gained enough supporters to pass the Senate. If it moves to a vote, it would be the first child safety bill to get this far in 25 years, since the Children's Online Privacy Protection Act passed in 1998.

If passed, KOSA would make tech platforms responsible (aka give them a “duty of care”) for preventing and mitigating harm to minors on topics ranging from mental health disorders and online bullying to eating disorders and sexual exploitation. Users would also be allowed to opt out of addictive design features, such as algorithm-based recommendations, infinite scrolling, and notifications. 

In a previous iteration of KOSA, state attorneys general were able to enforce the duty of care. However, some LGBTQ+ groups were concerned that Republican AGs would use the law to take action against resources for LGBTQ+ youth. The amended version leaves enforcement to the Federal Trade Commission — a move that led a number of advocacy groups, including GLAAD, the Human Rights Campaign, and The Trevor Project, to state they wouldn’t oppose the new version of KOSA if it moves forward. (So, not an endorsement, but not-not an endorsement.)

What’s next? As of this publication, DeSantis has not signed or vetoed Florida’s social media ban. Plus, KOSA has yet to be introduced to the Senate for a vote, and it’s flying solo — there is no companion bill in the House, which would let both chambers consider the measure simultaneously. 

However, the fallout from January’s Senate Judiciary Committee hearing — in which lawmakers grilled tech CEOs about their alleged failure to stamp out child abuse material on their platforms — may build momentum for future online safety legislation. We’ll keep our eyes peeled.

Practical Parenting Tips

How to use Spotify parental controls

Spotify offers everything from podcasts to audiobooks — and with all of that media comes content concerns. The good news: both Spotify Kids and Spotify parental controls allow kids to enjoy their tunes while keeping their ears clean.

Is One Piece for kids?

If you remember watching the pirate-themed anime series One Piece, you might be excited about the recently released live-action remake now streaming on Netflix and eager to share your love of the show with your kids. But is One Piece for kids?

What’s Catching Our Eye

🔒 Did you know that 90% of caregivers use at least one parental control? That’s according to a new survey from Microsoft.

📱 Social media is associated with a negative impact on youth mental health — but a lot of the research we have tends to focus on adults. In order to really understand cause and effect, researchers need to talk to teens about how they use their phones and social networks. Read more via Science News.

🛑 Meta announced the expansion of the Take It Down program, which is “designed to help teens take back control of their intimate images and help prevent people — whether it’s scammers, ex-partners, or anyone else — from spreading them online.”

Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.

Child listening to music

At first glance, the idea of setting Spotify parental controls might seem surprising. After all, isn’t Spotify just a music streaming platform? In reality, Spotify offers everything from podcasts to audiobooks — and with all of that media comes content concerns. 

Maybe you’ve heard reports of pornography showing up on Spotify, or maybe you would rather your kids don’t repeat the f-bomb they picked up while belting along to the latest Olivia Rodrigo track. Whatever your motivation for being leery of giving your child free rein on Spotify, you don’t need to give up on the platform altogether. Both Spotify Kids and Spotify parental controls allow kids to enjoy their tunes while keeping their ears clean.

What is Spotify Kids? 

Spotify Kids is an ad-free service available exclusively with a Spotify Premium Family subscription. Designed for kids aged 12 and under, it features kid-friendly content specifically curated for the youngest listeners. Spotify Kids features music, audiobooks, and more, and allows parents to view and manage the content their child listens to. 

Is Spotify Kids safe? 

Not only does Spotify Kids not contain any content marked Explicit, but it’s also curated by humans, so you don’t have to worry about something sneaking past an algorithm or a filter. 

What age is Spotify Kids for? 

Spotify Kids is specifically designed for kids 12 and under. (Spotify’s terms require users of regular accounts to be at least 13.)

How much does Spotify Kids cost? 

In order to use Spotify Kids, you must have a Spotify Premium Family subscription. Spotify Premium Family is a discounted plan available for up to six family members. It costs $16.99/month, and you can cancel at any time.

How to set up Spotify Kids

Getting started on Spotify Kids is a breeze. Just follow these easy steps: 

  1. Subscribe to Spotify Premium Family.
  2. Download the Spotify Kids app to your child’s iOS or Android device.
  3. Follow the prompts to set up a PIN for accessing the settings within the app.
  4. Create an avatar for your child. (They’ll have fun helping you pick!)
  5. Select the appropriate age category for your child: under 6 or 5-12.

Spotify parental controls 

If you have kids 13 or older, don’t worry — there are still parental control options to keep their listening experience appropriate. To use Spotify parental controls, you must have a Spotify Family account.

Here’s how to set up Spotify parental controls: 

On a mobile device

  1. Open the Spotify website on your mobile browser. 
  2. Tap the Menu icon (≡) in the top right corner. 
  3. Tap Log in.
  4. Enter your account details.
  5. Tap Account Overview.
  6. Select Premium Family from the dropdown menu.
  7. In the People on this plan section, tap on the name of the family member whose account you want to manage.
  8. Tap Allow explicit content to toggle it to the off position. 

On a desktop

  1. Open the Spotify website.
  2. Log in to your account. 
  3. Click on the Premium Family tab on the left hand side. 
  4. In the People on this plan section, select the name of the family member whose account you want to manage.
  5. Click on Allow explicit content to toggle it to the off position. 

Spotify on shared family devices  

If your child uses Spotify on a shared family device, and you don’t want to restrict content for that device, be aware that they may come across material that isn’t appropriate for their age. Each family needs to weigh the pros and cons of restricting content and make the choice that’s right for their household. 

If you allow your child to use Spotify on an unrestricted account, it’s a good idea to have a discussion with them about questionable content they might encounter and monitor their use by keeping an ear out or peeking at the listening history. 

In short 

Both Spotify Kids and Spotify parental controls offer families options to keep their kids’ listening experience age-appropriate. If your child listens to music on other platforms, such as YouTube, make sure you use parental control settings on those websites and apps, too. With BrightCanary, you can monitor YouTube activity directly from your phone. 

While it’s good to let your child develop their own interests (and playlists), a little bit of supervision goes a long way in keeping your child from content they’re not old enough to handle on their own. 

Father and son talking on couch

Since the early days of the internet, parents have worried what their children are up to online, and companies have responded with parental controls to help keep kids safe. But the way we use the internet has changed dramatically since its inception. This shift has ushered in the need for new approaches to parental controls. Read on to learn how we got here and to explore the best parental controls and monitoring apps to protect kids online.

Types of parental controls 

There are four basic categories of parental controls, ranging from settings on your child’s devices to third-party software. 

Content filters

These controls filter out inappropriate content, thereby limiting what your child can access. In the early days of the internet, the only way to filter content was to install third-party software, such as Net Nanny. Now, the option to filter content is built right into search engines. 

Usage controls 

Usage controls include things like screen time limits and blocking access to certain types of apps, such as social media or gaming. Apple Screen Time is a prime example: this free service allows you to prevent your child from making purchases on the App Store without your permission, schedule quiet time for notifications, and more.

Computer user management 

User management tools are software that set different levels of access, depending on who’s using the device. If you log in to your family laptop, you’ll have unrestricted access, while your child’s profile will include limitations. Most computers now have this feature built-in. 

Monitoring tools

Monitoring tools do exactly what the name suggests: monitor your child’s activity online. What they monitor varies widely depending on the tool. For example, Apple’s Find My monitors your child’s location, while an app like BrightCanary monitors your child’s social media, text messages, and Google and YouTube activity.

The early days of parental controls 

Back in the Wild, Wild West of the World Wide Web, the options for parental controls were limited to web filters. In 1994, Net Nanny introduced a browser that filtered web and chat room content, blocked images, and masked profanity. 

While revolutionary at the time, Net Nanny arrived in the days when using the internet meant sitting at a desktop computer — typically a shared family device — with the unmistakable pings of a dial-up modem announcing whenever someone went online. 

Since then, a lot has changed about how we use technology. Kids can access the internet from the palm of their hand with smartphones, smart watches, and tablets. We’re always connected, always online, and always dealing with the compulsion to check social media feeds. These changes have introduced new needs for keeping kids safe online. 

The changing needs of parents and kids

Between WiFi, mobile devices, and social media, using the internet looks very different than it did in the early days of parental controls. And things like the advent of algorithms and the monetization of user data mean our lives are intertwined with the internet in ways we couldn’t have imagined back in dial-up days.  

So, what do modern parents really need from parental controls? 

  • Products that seamlessly integrate into their digital lives: This has been a challenge because, while the iPhone has become the dominant device among teens, Apple is notoriously guarded when it comes to allowing third-party apps to monitor activity. This means that very few parental monitoring solutions have been designed that make monitoring truly easy for parents with kids who use Apple devices. 
  • Products that complement what they’re already doing: Apple now offers robust parental control settings, and most social media platforms have their own suites of controls. This leaves less need for all-in-one apps like Bark and Qustodio, which can feel clunky and redundant when parents can now customize these settings (for free) directly on their phone. Other apps, such as BrightCanary, fill in the gaps by monitoring what other tools don’t, such as social media feeds.
  • The ability to monitor messages: Gone are the days when parents knew who their kids were chatting with because they could overhear them on the phone or sneak a peek as they sent instant messages on the family computer. Nowadays, kids primarily communicate over text messages and direct messages, not only on computers, but on phones, tablets, and smartwatches — often out of sight of parents. This shifting landscape has introduced new avenues for kids to be exposed to harmful content and requires new ways for parents to supervise their children.  

Modern solutions for parenting in the digital age

BrightCanary allows parents to keep tabs on their kid’s online life wherever and whenever, all from their own phone. It offers the most comprehensive coverage for kids on Apple devices and, unlike other apps, it actually allows parents to see what their kids are viewing online and read their text message conversations. It’s a modern solution for the needs of modern families. 

In short 

What families need from parental controls has shifted in recent years, but many companies have failed to keep up with these changes. BrightCanary offers modern parental control solutions that work for modern families. 

Children looking at tablet

It will come as no surprise to parents that YouTube is all the rage with kids. In fact, recent research suggests that nine out of 10 kids use YouTube, and kids under 12 favor YouTube over TikTok. With all of YouTube’s popularity, how can you make the platform safer for your child? Read on to learn how to set parental controls on YouTube. 

Why parental controls matter

As the name implies, YouTube is a platform for user-generated content. While this creates an environment ripe for creativity, it also means there’s a little bit of everything, including videos featuring violent and sexual content, profanity, and hate speech. 

Because YouTube makes it easy for kids to watch multiple videos in a row, there’s always the chance your child may accidentally land on inappropriate content. In addition, the comments sections on YouTube videos are often unmoderated and can be full of toxic messages and cyberbullying. 

Due to the risks, it’s important that parents monitor their child’s YouTube usage, discuss the risks with them, and use parental controls to minimize the chance they’re exposed to harmful content. 

How to set parental controls on YouTube

YouTube offers a variety of options for families looking to make their child’s viewing experience as safe as possible. Here are some important steps parents can take: 

Create a supervised Google account for YouTube

A supervised account will allow you to manage your child’s YouTube experience on the app, website, smart TVs, and gaming consoles. 

Select a content setting

There are three content setting options to choose from: 

  • Explore: Content rated for viewers 9+. This category also excludes live streams, with the exception of Premieres. 
  • Explore more: For viewers 13+. This setting includes a larger set of videos, including live streams. 
  • Most of YouTube: For viewers 13-17. This option includes almost everything on YouTube, but excludes content marked as 18+ by channels or by YouTube’s systems and reviewers. 

Set parental controls

Along with content settings, here are some additional YouTube parental controls to explore: 

  • Block specific channels: When monitoring your child's YouTube usage, if you encounter content you prefer they avoid, you have the option to block that channel. 
  • Review your child’s watch history: When you can't supervise their viewing at the moment, you can check what your child has been watching.  
  • Control video suggestions: If you don’t like the videos YouTube’s algorithm is suggesting for your child, try these steps to reset their YouTube algorithm:
    • Clear history
    • Pause watch history 
    • Pause search history
  • Disable Autoplay: This setting prevents YouTube from automatically playing the next suggested video.
  • Set time limits: If you need a little help enforcing screen time limits, this option shuts down the app when your child reaches their max. 

For step-by-step instructions for setting up parental controls, refer to this comprehensive guide by YouTube. 

Where parental controls on YouTube fall short

While YouTube offers an impressive array of parental control settings, you have to manually review your child’s content and watch history in order to catch any concerning content. 

BrightCanary is a parental monitoring app that fills in the gaps. Here’s how BrightCanary helps you supervise your child’s YouTube activity:

  • The app provides summaries of what your child is watching and searching for, so you don’t have to watch each video on your own.
  • Advanced technology automatically scans your child’s video activity and flags anything concerning, so you’ll know when you need to step in.
  • You can either view all of their YouTube activity, or just review any videos flagged as concerning.
  • You can monitor searches, videos, and posts — more coverage than other parental control apps on Apple devices.

YouTube vs. YouTube Kids

For parents looking for additional peace of mind, YouTube Kids provides curated content designed for children from preschool through age 12. 

For households with multiple children, parents can set up an individual profile for each child, so kids can log in and watch videos geared toward their age. YouTube Kids also allows parents to set a timer of up to one hour, limiting how long a child can use the app. 

Parents should be aware that switching to YouTube Kids isn’t a perfect solution. There’s still a chance that inappropriate content may slip through the filters. 

In fact, a study by Common Sense Media found that 27% of videos watched by kids 8 and under are intended for older audiences. And for families concerned about ads, YouTube Kids still has plenty of those — targeted specifically toward younger children. Keeping an eye on what your child is watching and talking to them about inappropriate videos and sponsored content is still a good idea even with YouTube Kids. 

It’s also worth noting that kids under 12 who have a special interest they want to pursue may find YouTube Kids limiting. A child looking to watch Minecraft instructional videos or do a deep dive into space exploration, for example, can find a lot more options on standard YouTube — plenty of which are perfectly appropriate for kids, even if they aren’t specifically geared toward them. It’s cases like this where parental controls and active monitoring with BrightCanary are especially useful. 

The takeaway

YouTube is a popular video platform with plenty to offer kids. It’s not without risks, though. Parents should monitor their child’s use and take advantage of parental controls to ensure a safe, appropriate viewing experience. 

shocked mother looking at phone

You do your best to keep an eye on your child’s online activity by asking questions, periodically checking their device, or perhaps using a monitoring service. (Good job, you!) 

But do you know what to do when your child watches inappropriate things? What if you discover your child watched a sexually explicit video? Or that they’ve been viewing content promoting disordered eating or self-harm? 

A discovery like this can be a lot to process, but you don’t have to go it alone. In this article, we’ll discuss what to do when you find something alarming on your child’s phone.

What to do when your child watches inappropriate things

Here are some steps you can take when you find out that your child has watched something inappropriate on their phone: 

Stay calm

It’s likely you’ll have strong feelings when your child watches something inappropriate. You might be worried, shocked, or angry. You might even feel disbelief, guilt, or denial. 

While these feelings are totally normal, it’s not productive to bring them into your conversation with your child. So, before you do anything, take the time you need to regulate your own emotions so you can approach your child calmly and rationally. 

This could mean talking to a trusted confidant, or simply giving yourself a few days' space before you tackle the situation.

Use empathy

As strong as your feelings were when you discovered your child watched something inappropriate, they may also have intense emotions about what they’ve seen. 

The reaction will vary from child to child and can range anywhere from confusion, to curiosity, to shame, to fear. Remember, they’re still learning, and your job is to guide them through this with empathy and love. 

Let them have their emotions and reassure them that, whatever has happened, you’re there for them. 

Listening is key

Before you launch into problem-solving mode, take the time to ask your child what happened, using open-ended questions as much as you can. Your goal is to gather the facts so you can decide how to address it.

Here are some conversation starters:

  • “How did you come across this?”
  • “Have you seen something like this before, or is this the first time?” 
  • “How did watching it make you feel?”  

Come up with a plan

Once you’ve established the facts, it’s time to figure out your next steps. You’ll want to tailor your response depending on the content that was viewed, whether your child sought it out or stumbled upon it, and whether someone else sent it to them. 

The most productive response is one that you work with your child to come up with. Ask them if they have any ideas for what to do before you offer your thoughts. 

Here are things you might include in your plan:

  • Discuss what they can do to be safer online and reduce the likelihood of something like this happening again. 
  • Make sure parental controls are in place to filter out inappropriate content. 
  • Reset your child’s algorithms to reduce the chances they’re shown concerning photos or videos.  
  • If you report the inappropriate content (to a social media platform, your internet service provider, or the authorities), involve your child in the process. This will help them feel more in control of what happened. 

Red flags  

When should you consult a professional? Nicole Baker, assistant professor of psychology at Franklin Pierce University, cautions that frequent exposure to harmful material is a serious concern. This includes sexually explicit or otherwise inappropriate messages, photos, and videos, as well as content that promotes violence, self-harm, harm to others, or drug and alcohol use. If your child is sending or receiving this content from peers or strangers, these are significant red flags.

“While it may be understandable that children come across some of this content in unmonitored environments,” Baker says, “it should be a cause for concern when children actively seek out these types of content online and frequently engage with it.”

Baker adds that it’s also a cause for concern if your child hides messages or uses secret apps to try to cover up their online behavior. 

When to involve the authorities

If your child was contacted by an online predator, or if the content they viewed involved nude, semi-nude, or sexually explicit videos or images of a child, document and report the material to the platform and law enforcement. You can also report the incident to the National Center for Missing and Exploited Children (NCMEC)’s Cyber Tip Line.

In short

The internet is full of all kinds of questionable content, and there’s a decent chance your child will run across some of it at one point or another. That’s why it’s important to prepare your child by talking to them about what to do if anyone — or anything — makes them feel uncomfortable online.

If you find something alarming on your child’s phone, there are steps you can take to minimize the harm and increase their safety going forward.

Be the most informed parent in the room.
Sign up for bimonthly digital parenting updates.
©2024 Tacita, Inc. All Rights Reserved.