How to Reset Your Child’s Social Media Algorithm

By Andrea Nelson
October 19, 2023

As a parent, you want your child to surround themselves with good influences. That’s true not only for who they spend time with in real life, but also for the people and ideas they’re exposed to on social media. 

If you or your child are concerned about the content appearing in their feed, one beneficial step you can take is to help them reset their social media algorithm. Here’s how to reset your child’s algorithm on TikTok, Instagram, and other platforms.

What is a social media algorithm?

Social media algorithms are the complex computations that operate behind the scenes of every social media platform to determine what each user sees. 

Everything on your child’s social media feed is likely the result of something they liked, commented on, or shared. (For a more comprehensive explanation, check out our Parent’s Guide to Social Media Algorithms.)

Social media algorithms have a snowball effect. For example, if your child “likes” a cute dog video, they’ll likely see more of that type of content. However, if they search for topics like violence, adult material, or conspiracy theories, their feed can quickly be overwhelmed with negative content.

Therefore, it’s vital that parents actively examine and reset their child’s algorithm when needed, and also teach them the skills to evaluate it for themselves. 

Research clearly demonstrates the potentially negative impacts of social media on tweens and teens. How it affects your child depends a lot on what’s in their feed. And what’s in their feed has everything to do with algorithms. 

Talking to your child about their algorithm

Helping your child reset their algorithm is a wonderful opportunity to teach them digital literacy. Explain to them why it’s important to think critically about what they see on social media, and how what they do on the app influences the content they’re shown. 

Here are some steps you can take together to clean up their feed: 

Start with their favorite app

Resetting all of your child’s algorithms in one fell swoop can be daunting. Instead, pick the app they use the most and tackle that first. 

Scroll through with them

If your kiddo follows a lot of accounts, you might need to break this step into multiple sessions. Pause on each account they follow and have them consider these questions:

  • Do this person’s posts usually make me feel unhappy or bad about myself? 
  • Does this account make me feel like I need to change who I am? 
  • Do I compare my life, body, or success with others when I view this account? 

If the answer is “yes” to any of these questions, suggest they unfollow the account. If they’re hesitant — for example, if they’re worried unfollowing might cause friend problems — they can instead “hide” or “mute” the account so they don’t see those posts in their feed. 

Encourage interaction with positive accounts 

On the flip side, encourage your child to interact with accounts that make them feel good about themselves and portray positive messages. Liking, commenting, and sharing content that lifts them up will have a ripple effect on the rest of their feed. 

Dig into the settings 

After you’ve gone through their feed, show your child how to examine their settings. These settings mostly influence sponsored content, but considering the problematic history of advertisers marketing to children on social media, it’s wise to take a look.  

Every social media app has slightly different options for how much control users have over their algorithm. Here's what you should know about resetting the algorithm on popular apps your child might use.

How to reset Instagram algorithm

  • Go to Settings > Ads > Ad topics. You can view a list of all the categories advertisers can use to reach your child. Tap “See less” for ads you don’t want to see. 
  • Go to your child’s profile > tap Following > scroll through the categories to view (and unfollow) the accounts that appear most in your child’s feed.
  • Tap the Explore tab in the bottom navigation bar and encourage your child to search for new content that matches their interests, like cooking, animals, or TV shows.

How to reset TikTok algorithm

  • Go to Settings > Content Preferences > Refresh your For You feed. This is like a factory reset of your child’s TikTok algorithm.
  • Go to Settings > Free up space. Select “Clear” next to Cache. This will remove any saved data that could influence your child’s feed.
  • As your child uses TikTok, point out the “Not Interested” feature. Tap and hold a video to pull up this button. Tapping “Not interested” tells TikTok’s algorithm not to show your child videos they don’t like. 

How to reset YouTube algorithm

  • Go to Library > View All. Scroll back through everything your child has watched. You can manually remove any videos that your child doesn’t want associated with their algorithm — just tap the three dots on the right side, then select Remove from watch history.
  • Go to Settings > History & Privacy. Tap “Clear watch history” for a full reset of your child’s YouTube algorithm.

What to watch for

To get the best buy-in and help your child form positive long-term content consumption habits, it’s best to let them take the lead in deciding what accounts and content they want to see. 

At the same time, kids shouldn't have to navigate the internet on their own. Social platforms can easily suggest content and profiles that your child isn't ready to see. A social media monitoring app, such as BrightCanary, can alert you if your child encounters something concerning.

Here are a few warning signs you should watch out for as you review your child's feed: 

  • Content that promotes violence, self-harm, or disordered eating
  • Conspiracy theories and other misinformation
  • Adult material or other age-inappropriate content

If you spot any of this content, it’s time for a longer conversation to assess your child’s safety. You may decide it’s appropriate to insist they unfollow a particular account. And if what you see on your child’s feed makes you concerned for their mental health or worried they may harm themselves or others, consider reaching out to a professional.  

In short 

Algorithms are the force that drives everything your child sees on social media and can quickly cause their feed to be overtaken by negative content. Regularly reviewing your child’s feed with them and teaching them skills to control their algorithm will help keep their feed positive and minimize some of the negative impacts of social media. 


Just by existing as a person in 2023, you’ve probably heard of social media algorithms. But what are algorithms? How do social media algorithms work? And why should parents care? 

At BrightCanary, we’re all about giving parents the tools and information they need to take a proactive role in their children’s digital life. So, we’ve created this guide to help you understand what social media algorithms are, how they impact your child, and what you can do about it. 

What is a social media algorithm? 

Social media algorithms are complex sets of rules and calculations used by platforms to prioritize the content that users see in their feeds. Each social network uses different algorithms. The algorithm on TikTok is different from the one on YouTube. 

In short, algorithms dictate what you see when you use social media and in what order. 

Why do social media sites use algorithms?

Back in the Wild Wild West days of social media, you would see all of the posts from everyone you were friends with or followed, presented in chronological order. 

But as more users flocked to social media and the amount of content ballooned, platforms started introducing algorithms to filter through the piles of content and deliver relevant and interesting content to keep their users engaged. The goal is to get users hooked and keep them coming back for more.  

Algorithms are also hugely beneficial for generating advertising revenue for platforms because they help target sponsored content. 

How do algorithms work? 

Each platform uses its own mix of factors, but here are some examples of what influences social media algorithms:

Friends/who you follow 

Most social media sites heavily prioritize showing users content from people they’re connected with on the platform. 

TikTok is unique because it emphasizes showing users new content based on their interests, which means you typically won’t see many posts from people you follow in your TikTok For You feed. 

Your activity on the site

With the exception of TikTok, if you interact frequently with a particular user, you’re more likely to see their content in your feed. 

The algorithms on TikTok, Instagram Reels, and Instagram Explore prioritize showing you new content based on the type of posts and videos you engage with. For example, the more cute cat videos you watch, the more cute cat videos you’ll be shown. 

YouTube looks at the creators you interact with, your watch history, and the type of content you view to determine suggested videos. 

The popularity of a post or video 

The more likes, shares, and comments a post gets, the more likely it is to be shown to other users. This momentum is the snowball effect that causes posts to go viral. 
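
To make this snowball effect concrete, here’s a deliberately simplified sketch (in Python) of an engagement-weighted feed. This is not any platform’s real ranking code; the function name, topics, weights, and numbers are invented purely for illustration. But it shows how a single tap on a topic can start pulling related posts up the feed.

```python
# Toy illustration of engagement-weighted feed ranking.
# NOT any platform's actual algorithm; topics, weights, and numbers
# are made up to show how the "snowball effect" works.
from collections import Counter

def rank_feed(posts, engagement_history):
    """Sort posts so topics the user has engaged with rise to the top,
    with a small boost for a post's overall popularity."""
    topic_affinity = Counter(engagement_history)  # e.g. {"dogs": 2, "thinspo": 1}

    def score(post):
        # Heavy weight on personal engagement, light weight on raw likes.
        return topic_affinity[post["topic"]] * 10 + post["likes"] / 100

    return sorted(posts, key=score, reverse=True)

posts = [
    {"topic": "dogs", "likes": 300},
    {"topic": "cooking", "likes": 900},
    {"topic": "thinspo", "likes": 50},
]

# After two dog videos and one curious tap on a "thinspo" post...
history = ["dogs", "dogs", "thinspo"]
print([p["topic"] for p in rank_feed(posts, history)])
# ['dogs', 'thinspo', 'cooking'] -- the low-popularity post the user
# clicked once already outranks far more popular content.
```

Real recommendation systems are vastly more complex, but the feedback loop is the same: what your child engages with today shapes what they’re shown tomorrow.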

Why should parents care about algorithms? 

There are ways social media algorithms can benefit your child, such as creating a personalized experience and helping them discover new things related to their interests. But the drawbacks are also notable — and potentially concerning. 

Since social media algorithms show users more of what they seem to like, your child's feed might quickly become overwhelmed with negative content. Clicking a post out of curiosity or naivety, such as one promoting a conspiracy theory, can inadvertently expose your child to more such content. What may begin as innocent exploration could gradually influence their beliefs.

Experts frequently cite “thinspo” (short for “thinspiration”), a social media topic that aims to promote unhealthy body goals and disordered eating habits, as another algorithmic concern.

Even though most platforms ban content encouraging eating disorders, users often bypass filters using creative hashtags and abbreviations. If your child clicks on a thinspo post, they may continue to be served content that promotes eating disorders.

Social media algorithm tips for parents

Although social media algorithms are something to monitor, the good news is that parents can help minimize the negative impacts on their child. 

Here are some tips:

Keep watch

It’s a good idea to monitor what the algorithm is showing your child so you can spot any concerning trends. Regularly sit down with them to look at their feed together. 

You can also use a parental monitoring service to alert you if your child consumes alarming content. BrightCanary is an app that continuously monitors your child’s social media activity and flags any concerning content, such as photos that promote self-harm or violent videos — so you can step in and talk about it.

Stay in the know

Keep up on concerning social media trends, such as popular conspiracy theories and internet challenges, so you can spot warning signs in your child’s feed. 

Communication is key

Talk to your child about who they follow and how those accounts make them feel. Encourage them to think critically about the content they consume and to disengage if something makes them feel bad. 

In short

Algorithms influence what content your child sees when they use social media. Parents need to be aware of the potentially harmful impacts this can have on their child and take an active role in combating the negative effects. 

Stay in the know about the latest digital parenting news and trends by subscribing to our weekly newsletter.


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Does it feel like your child is texting in a different language? Save this guide to some of the most common slang and emojis kids are using today, including ones that could mean they’re up to trouble.
  • 62% of parents feel burned out by parenting, according to a new survey by Ohio State University. 
  • KOSA. PATA. COPPA 2.0. No, this isn’t Wordle — we’re breaking down a few notable pieces of child online safety legislation currently under consideration in Congress.

Digital Parenting

What is Congress doing to keep kids safe online?

Before we talk about child online safety legislation, let’s talk about seat belts. 

In the 1980s, states began implementing laws requiring people to wear seat belts in cars. Despite studies from the 1950s demonstrating that seat belts save lives, it wasn’t until these laws were implemented that buckling up became routine. You enter a car, you fasten your seat belt. It’s a simple safety step that’s also mandated by law.

However, between the 1950s and 1980s, there was a long stretch when people knew that seat belts were protective — but they didn’t necessarily use them. It took laws to turn that knowledge into a routine habit that saves lives.

A similar discussion is happening today with social media. A growing body of research points to social media’s negative effects on kids, ranging from their well-being to their brain development. But there are no national regulations to safeguard children on social media, and those that are passed at the state level face significant legal pushback from major tech companies.

In Congress, several pieces of legislation that impact children online are currently under discussion. Let’s look at a few of them making headway this legislative session:

Kids Online Safety Act (KOSA): Sets new safety standards for social media companies and holds them accountable for protecting minors. Users would also be allowed to opt out of addictive design features, such as algorithm-based recommendations, infinite scrolling, and notifications. The bill awaits a vote in the Senate and has been introduced in the House.

Children and Teens’ Online Privacy Protection Act (COPPA 2.0): Updates the Children’s Online Privacy Protection Act (COPPA). This measure would make it illegal for websites to collect data on children under the age of 16, outlaw marketing specifically aimed at kids, and allow parents to erase their kids’ information on websites. The bill awaits a vote in the Senate.

Sammy’s Law: Would require social media companies to integrate with child safety software, making it easier for parents to supervise their children’s online activities. The bill is currently in the House subcommittee on Innovation, Data, and Commerce.

Platform Accountability and Transparency Act (PATA): Provides protected ways for researchers to study data from big internet companies, focusing on how these platforms impact society. PATA would make it clearer how online platforms manage children's data and the effects of their algorithms. The bill was read twice in the Senate and referred to committee. 

Also worth noting is the American Privacy Rights Act (APRA), a significant bipartisan measure yet to reach committee. It would establish national privacy and security standards, requiring transparent data usage and giving consumers, particularly children, greater control over their personal information.

In the future, we may look back at this period and wonder how we didn’t have stricter measures in place to protect kids online — just like that period when we didn’t wear seat belts. You can talk to lawmakers about the importance of children’s online safety legislation. To find your representative, go to congress.gov/members/find-your-member.


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


Practical Parenting Tips

Understanding slang and secret codes in your child’s texts

You know you should monitor your child’s texts, but actually understanding their messages is a whole other story. Like previous generations of kids, Gen Z and Gen Alpha use slang to put their own spin on the way they communicate. We break down what it all means, bruh.

9 mistakes parents make with text message monitoring

While it’s responsible to monitor your child’s text messages, that doesn’t mean anything goes. Here are some of the top mistakes parents make when monitoring their child’s texts so you can avoid making them yourself. 


Tech Talks With Your Child

How will you check in with your child this week? Save these conversation-starters for your next tech check-in. 

  1. Do you ever have trouble sleeping because you’re on your phone before bed? 
  2. How do you feel when you get a lot of notifications on your phone?
  3. I’d like to implement a no-phone rule at the dinner table so we can be more present with each other. What do you think about that?
  4. Is there anything cool you saw online that you want to try this week, like a recipe or a new place to visit?
  5. Let’s talk about online privacy best practices. Do you use the same password for multiple accounts, or do you use different passwords?

What’s Catching Our Eye

📵 Following a smartphone ban in Norway schools, middle school kids report feeling mentally healthier and performing better academically. After three years of the policy, girls’ visits to mental health professionals decreased by 60%, and both boys and girls experienced 43–46% less bullying.

🕯️ According to a new survey by Ohio State University, a majority of parents experience isolation, loneliness, and burnout from the demands of parenthood. A whopping 62% feel burned out by their responsibilities as a parent. Parental burnout researcher Kate Gawlik, DNP, stressed the need for self-care and the value of connection, encouraging parents to find local parent groups.

🐤🤖 Did you know? BrightCanary features an AI chatbot called Ask the Canary: an easy way to anonymously get answers to your toughest parenting questions. Find it in the BrightCanary app.

Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • 50% of parents say they look through their teen’s smartphone, and 47% set time limits on their phone use, according to Pew Research Center.
  • Apple parental controls aren’t foolproof. We troubleshoot common complaints with Apple Screen Time.
  • The APA’s latest health advisory calls out social media companies for not doing enough to protect kids.
  • Is TikTok getting banned? The House has officially passed a measure that would require TikTok to divest from its parent company. Here’s what happens next.


Digital Parenting

House passes measure to sell or ban TikTok

Another week, another round of TikTok drama: last week, the House passed a bill requiring the forced sale or ban of TikTok in the U.S. 

The bill, titled the Protecting Americans from Foreign Adversary Controlled Applications Act (H.R. 7521), requires TikTok’s Chinese parent company, ByteDance, to sell the app’s U.S. operations within nine months (previously six, but the latest version of the bill extended the timeline with the potential to become a full year). Otherwise, much like dancing in Footloose, it would be illegal for TikTok to be available for download in U.S. app stores. 

Lawmakers claim that TikTok poses a national security threat because the Chinese government could potentially access the data of U.S. users and use the platform's algorithm to influence American public opinion. TikTok stated it has never been asked to provide U.S. user data to the Chinese government, wouldn’t do so if asked, and doesn’t tailor content based on political motives. 

What happens next? The proposal sailed through a House panel earlier this month, but faced an uncertain future in Congress until it was attached to a foreign aid package that will send funds to Ukraine and Israel, making it more likely to be passed in the Senate. If passed, the bill could land on President Biden’s desk in the next week.

This doesn’t mean TikTok will be banned in time for Mother’s Day. The platform would have nine months to find a buyer, although it’s not clear if TikTok’s algorithm — aka the thing that makes it so compulsively scrollable and knows exactly which ASMR cooking videos to show you — will come with it.

If your child asks about the TikTok ban: Explain the topic in a way that’s appropriate for your child. The platform hasn’t been banned, but lawmakers are asking TikTok to find a new owner because they’re worried about how the app handles users’ personal information. Now’s a great time to explain how social media algorithms work, why it’s important to think critically about the information we consume, and how a bill moves through Congress.


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


APA calls on social media companies to take responsibility to protect youth

The American Psychological Association (APA) recently released another report on social media, calling on tech companies to fundamentally redesign social media to correct harmful features that are unsafe for adolescents. 

Last year, the APA issued a health advisory on social media use in adolescence, in which the organization recognized the potential social benefits of social media but called out the need to protect kids from harmful content and problematic behaviors. This new report highlights the fact that companies and policymakers “still have made few meaningful changes” (translation: haven’t taken actions that’ll actually help kids). 

The report highlights the ways in which common features of social media, such as infinite scroll and notifications, negatively impact kids. It also suggests paths forward for companies and policymakers. Some takeaways:

  • Several proposed child online safety bills ban kids from using social media under a certain age. However, the APA argues that a single age isn’t associated with social media readiness — a child isn’t magically ready to use social media the moment they turn 15. Instead, social media use, function, and permissions should be tailored for kids. “Design features created for adults may not be appropriate for children,” the report states.
  • Parents should monitor their children's social media use. It's crucial to teach adolescents how to responsibly engage with these platforms, including how to limit exposure to harmful content such as that promoting negative body image or self-harm. Additionally, they should learn healthy behaviors to prevent social media from affecting their sleep and physical activity.
  • We need common-sense policies that require social media companies to make these platforms safer for their youngest users. Parental controls are helpful, but it’s not enough for tech companies to delegate responsibility to parents, app stores, or youth themselves. “That responsibility sits with the creators and purveyors of these technologies — the platform developers themselves,” said Mary Ann McCabe, PhD, part of the expert panel that put together the 2023 health advisory.


Practical Parenting Tips

Apple Screen Time not working? Monitoring tips and tricks

Apple Screen Time is a great tool to set limits and restrict certain activities. But Apple parental controls aren’t foolproof. We break down common complaints and new ways to keep your kiddo safe online.

Is Nintendo Switch safe for kids?

Whether your kid is already obsessed with their Switch or wants a console to play with friends, you should know that Nintendo Switch parental controls exist, and you can use them to set time limits, limit certain games, and more. 


Tech Talks With Your Child

How will you check in with your child this week? Save these conversation-starters for your next tech check-in. 

  1. "Have you noticed if using your phone before bed makes it harder for you to fall asleep or stay asleep?"
  2. “Has anyone said something in your texts or messages that made you feel uncomfortable or upset?"
  3. "How do you use social media to stay connected with your friends? Do you think it helps you keep in touch better?"
  4. "Have you discovered any new hobbies or interests online? What are they?"
  5. “What are some of your favorite accounts on YouTube or social media right now?”


What’s Catching Our Eye

🤔 What are social media algorithms, and how should you talk to your kids about them? BrightCanary CEO Karl Stillner writes for the Family Online Safety Institute about what parents should know. 

🚫 Meta has rolled out new tools to help protect against sextortion and intimate image abuse on Instagram and Facebook.

👀 Do you ever look through your teen’s smartphone? According to Pew Research Center, 50% of parents say they do, and 47% say they set time limits on their teens’ phone use.

📞 The latest Gen Z trend: dumbphones. In other words, flip phones are back (here are our recommendations). 


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • 55% of students use social media to self-diagnose mental health conditions. We break down what this means and how parents can talk to their kids about what’s on their feed. 
  • As of Jan. 1, 2025, kids under 14 are banned from having social media accounts in Florida — assuming the bill isn't held up in court.
  • This week in Tech Talks: conversation-starters to check in with your child about mental health, checking their sources, and more.

Digital Parenting

Kids are using social media to self-diagnose

If your teen suddenly has a new lexicon of mental health terms, like “trauma response” and “major depressive disorder,” TikTok may be to blame. A poll by EdWeek found that 55% of students use social media to self-diagnose mental health conditions, and 65% of teachers say they’ve seen the phenomenon in their classrooms. 

“Kids are all coming in and I’m asking them, ‘Where did you get this diagnosis?’” said Don Grant, national adviser for healthy device management at Newport Healthcare, in an interview with The Hill. Grant said he would get responses such as “Oh, there’s an [influencer],” “Oh, I took a quiz,” or “Oh, there’s a group on social media that talks about it.”  

Social media can help kids understand their feelings and find ways to cope. The EdWeek poll found that 72% of educators believe social media has made it easier for students to be more open about their mental health struggles. And it makes sense that kids would turn to a space they know — social media and online groups — to get information, rather than finding a mental health professional first (or talking to their parents). 

However, the topic gets tricky when you consider the fact that social media sites don’t exactly verify that the people sharing medical advice are, in fact, medical experts. While there are plenty of experts sharing legitimate information online, there are also influencers who are paid to talk about products that improved their anxiety and off-label medications that cured their depression. 

Big picture: Self-diagnosing on social media is also problematic because algorithms can create a self-fulfilling prophecy. Most algorithms, like TikTok’s, use a user’s activity to determine what they see next in their feed. If a teen thinks they have depression, they’ll see more content about depression — which may confirm their self-diagnosis, even if they aren’t clinically depressed.

As parents, it’s important to talk to your child about mental health, how to cope with big emotions, and what to do if they need a professional. But it’s also essential to know where they’re getting their mental health information and what they’re seeing on their social media feeds. 

Don’t dismiss their feelings outright — be curious. Talk to your child about verifying their sources of information. If they’re getting medical advice from an online creator, are they an actual doctor or therapist? Or are they simply someone who’s popular online?


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


Florida passes one of the most restrictive social media bans for minors

Gov. Ron DeSantis recently signed a bill that bans kids under 14 from creating social media accounts and requires parental consent for kids under 16. The bill requires that companies delete accounts belonging to 14- and 15-year-olds and implement age verification measures to ensure that kids aren’t lying about their ages. 

Florida’s bill is the most restrictive social media ban in the nation, and that’s after DeSantis vetoed an earlier version of the bill that would have banned all accounts for kids under 16. At the bill-signing ceremony, Republican Speaker Paul Renner said, “A child in their brain development doesn’t have the ability to know that they’re being sucked into these addictive technologies and to see the harm and step away from it, and because of that we have to step in for them.”

Legal upheaval: The bill takes effect Jan. 1, 2025, pending any legal challenges. Tech industry groups have already come out against the bill, including NetChoice, an association that represents major social media platforms and is currently fighting a separate social media law before the Supreme Court. 

“This bill goes too far in taking away parents’ rights,” Democratic Rep. Anna Eskamani said in a news release. “Instead of banning social media access, it would be better to ensure improved parental oversight tools, improved access to data to stop bad actors, alongside major investments in Florida’s mental health systems and programs.”

In our last issue, we covered Utah’s decision to repeal and replace its social media law after months of legal challenges that delayed the bill’s implementation. Although DeSantis and Renner have signaled that they’re ready to fight to keep Florida’s social media ban in place, time will tell whether or not Florida’s kids will have to wait until their sweet 16 to get on Snapchat. 


Tech Talks With Your Child

How will you check in with your child about online safety this week? Save these conversation-starters for your next check-in. 

  1. "Have you ever come across anything online that made you feel uncomfortable or worried?”
  2. "Do you know how to check if information you find online is true or reliable? Let's talk about how to evaluate sources together."
  3. "How do you feel after spending time on social media? Does it ever affect your mood or feelings about yourself?"
  4. "What would you do if you received a message or saw a post that talked about depression or anxiety? Do you know who to talk to?"
  5. “What are some ways you like to spend time with your friends offline? Can we plan any upcoming events or get-togethers?”

Practical Parenting Tips

How does screen time affect sleep?

Sleep can impact everything from brain performance, to mood, to mental and physical health. Our children aren’t getting enough sleep, either, and screens are one of the prime suspects. But how does screen time affect sleep?

A parent’s guide to Pinterest parental controls

Pinterest use is up among teens. Gen Zers are using the website as a canvas for self-expression and exploration. Learn more about how to keep your child safe on the site with Pinterest parental controls.  


What’s Catching Our Eye

😮‍💨 What is the “mental load” of parenting, and how does it affect your emotions, sleep quality, and job performance?

🚩 What are the red flags that you need to worry about your child’s mental health? Save this list from Techno Sapiens.

🤝 Rules and restrictions aren’t the end-all, be-all to parenting in the digital age — you also need a healthy, emotionally rich relationship with your teen. Read more at Psychology Today.

📵 When it comes to protecting kids’ mental health, Florida’s social media ban won’t be that simple, writes David French for the New York Times.


In a surprising resurgence of the platform’s cool factor, Pinterest use is up among teens. Gen Zers are using the website as a canvas for self-expression and exploration. Read on to learn more about how to keep your child safe on the site with Pinterest parental controls.  

What is Pinterest? 

Pinterest describes itself as “a visual discovery engine for finding ideas.” Users save “Pins” of images or videos to virtual boards. They can record live videos and take photos right in the app, or save images found elsewhere on the internet as Pins. 

How your child might use Pinterest

Many kids come to Pinterest to find inspiration and share ideas around a hobby or interest. Teens are more likely than their adult counterparts to create Pins of things they’ve made and their outfits. Kids also use it to connect with others around common interests, such as books, beauty, or fashion.

How your child might interact with others on Pinterest

Pinterest allows users to interact with each other through comments, direct messages, and shared boards. Although Pinterest may seem relatively tame in comparison to TikTok or Snapchat, parents should take the same precautions as they do with other social media sites.   

Here are some ways people might interact with your child on Pinterest: 

  • Group boards: Boards can be secret or public. Secret boards become group boards when users are invited as collaborators. 
  • Followers: Anyone can follow a public account. Accounts set to private aren’t discoverable, but users can invite people to follow their private account. 
  • Reactions and comments: Users can react to and comment on Pins. 
  • Direct message: Users can exchange private messages with one another. 
  • Mentions: People can use the @ symbol to mention other users in Pin comments and descriptions, which notifies the person mentioned.
  • Sharing: Pins can be shared on other social media networks, sent to users on Pinterest, and shared via email with people who aren’t on Pinterest.

Risks of letting your child use Pinterest 

Just like any social media site, there are risks parents need to be aware of. In 2023, NBC News reported that adult men were using Pinterest to create boards with pictures of young girls and teenagers. The platform responded by rolling out a suite of new Pinterest parental controls, which we’ll discuss below.

Aside from online predators, Pinterest can also expose your child to content that promotes negative body image, negative self-esteem, and even suicidal thoughts. Like other websites, Pinterest uses an algorithm to recommend content based on what your child searches and the pins they click. Research shows that excessive social media use can make kids feel bad about themselves, so it’s important to talk to your child about the content on their feed and limit the time they spend on social media — including Pinterest.

Exposure to inappropriate content is also a risk on Pinterest. Pins can lead kids to websites with explicit content, misinformation, and just plain spam, solely because they clicked a pin that caught their attention. 

Benefits of letting your child use Pinterest 

There are also plenty of positive reasons to let your child use Pinterest, with guardrails. 

For example, Pinterest can be a great source of inspiration, creative expression, and connection because users have the ability to dive deeper into their interests. Plus, Pinterest is full of tutorials that can help kids learn new skills, like cooking and coding. 

Pinterest can even foster a boost of positivity. Recent research from Pinterest and University of California, Berkeley, found that daily interaction with inspiring content on Pinterest helped buffer students against things like burnout and stress. 

How to use Pinterest parental controls

The good news is that Pinterest parental controls are fairly robust. The company recently took steps to protect minors on their site, including age verification, automatically setting accounts to private for users under 16, and additional reporting options. The minimum age for Pinterest users is 13.

There are also extra steps you can take to keep your child safe on Pinterest: 

  • Verify their age: Confirm they entered their age correctly when they signed up for an account to ensure the teen safety settings are in place. 
  • Monitor their account: Follow your child on Pinterest and sit down with them periodically to view their feed together.  
  • Set up a parental passcode: This code locks certain privacy, data, and social permissions settings. 
  • Help them set their privacy: Check that their account is set to private and show them how to adjust their settings to control who can view their content. They can also edit their profile to control what information is displayed.  
  • Encourage them to use secret boards: Secret boards can only be viewed by your child and people they invite. They should only invite people they know in real life.
  • Talk with them about safety: Encourage your child to only share content with people they know and trust. Discuss the risks of allowing people they don’t know access to their boards and remind them to be cautious about what they share. 
  • Establish open communication: Be upfront about the risks your child may face on Pinterest. Make it clear they can come to you if they have a problem, and you’ll help them through it. 
  • Report and block: Show your child how to report inappropriate Pins, users, and messages. Make sure they also know they can block users who make them uncomfortable.

The bottom line

While Pinterest can be a positive creative outlet for kids, it’s not without risk. Parents should educate themselves about the potential dangers and take steps to keep their child safe on the site. 


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Are the days of lip synching to trending songs coming to an end? We break down the proposed TikTok ban heading to a Senate vote.
  • Speaking of social media, have you changed your child’s privacy settings on Instagram? We share tips to make your child’s Instagram account safer.
  • 44% of teens say they feel anxious without their smartphones, according to a new Pew Research Center survey.

Digital Parenting

Inside the proposed TikTok ban

Today, the House overwhelmingly voted to pass a bill that would effectively ban TikTok in the United States. The bill now heads to the Senate, where its future is less certain. The measure, H.R. 7521, would ban applications controlled by foreign adversaries of the United States that pose a clear national security risk. 

For years, US officials have dubbed TikTok a national security threat. China’s intelligence laws could enable Beijing to snoop on the user information TikTok collects. Although the US government has not publicly presented evidence that the Chinese government has accessed TikTok user data, the House vote was preceded by a classified briefing on national security concerns about TikTok's Chinese ownership.

If H.R. 7521 is passed, ByteDance will have 165 days to sell TikTok. Failure to do so would make it illegal for TikTok to be available for download in U.S. app stores. On the day of the vote, TikTok responded with a full-screen pop-up that prompted users to dial their members of Congress and express their opposition to the bill. In a post on X, TikTok shared: “This will damage millions of businesses, deny artists an audience, and destroy the livelihoods of countless creators across the country.”

"It is not a ban,” said Representative Mike Gallagher, the Republican chairman of the House select China committee. “Think of this as a surgery designed to remove the tumor and thereby save the patient in the process."

The bottom line: The bill passed the House Energy and Commerce Committee unanimously, which means legislators from both parties supported the bill. Reuters calls this the “most significant momentum for a U.S. crackdown on TikTok … since then President Donald Trump unsuccessfully tried to ban the app in 2020.” The TikTok legislation's fate is less certain in the Senate. If the bill clears Congress, though, President Biden has already indicated that he would sign it.

If your child uses TikTok, it’s natural that they may have questions about the ban (especially if they dream of becoming a TikTok influencer). Nothing is set in stone, and it’s entirely possible that TikTok would simply change ownership. However, this is a good opportunity to chat with your kids about the following talking points:

  • It’s true that social media can be entertaining and educational. 
  • But social media companies can buy and sell your data, use algorithms to change your opinions about topics, and design their apps to make you spend more time using them.
  • We elect representatives to represent us. That’s why it’s important to vote, stay informed about current events, and think critically about the information you consume.

Practical Parenting Tips

Is Instagram safe for kids? A parent’s guide to safety recommendations

Set your child’s account to private, limit who can message them, and limit reposts and mentions. With a few simple steps, you can make Instagram a safer place for your kid. Here’s how to get it done.

How to talk to your child about sending inappropriate text messages

Yikes — you found out that your child has been sending concerning videos, images, or messages to someone else. We break down some of the reasons kids send inappropriate messages and how to approach them.


What’s Catching Our Eye

🏛️ An update on Florida’s social media ban: as expected, Governor Ron DeSantis vetoed a bill that would have banned minors from using social media, but signaled that he would sign a different version anticipated from the Florida legislature.

📵 Nearly three-quarters (72%) of U.S. teens say they feel happy or peaceful when they don’t have their smartphones — but 44% say they feel anxious without them, according to Pew Research Center.

📖 Do digital books count as screen time? The benefits of reading outweigh screen time exposure, according to experts.

🗺️ How can parents navigate the challenges of technology and social media? Set limits, help your child realize how much time they spend on tech, and model self-restraint. Check out these tips and more via Psychology Today.

Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.

Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Florida may ban social media for minors. The bill, HB1, is currently awaiting Governor DeSantis’ veto or approval.
  • The Kids Online Safety Act just hit a big milestone: it officially has enough supporters to pass the Senate, although the bill hasn’t yet moved to a vote.
  • Meta announced the expansion of a program to help teens avoid sextortion scams on Facebook and Instagram.

Digital Parenting

New milestone for online safety legislation

A Florida bill that bans minors from using social media recently passed the House and Senate. The bill, HB1, is now on Governor Ron DeSantis’ desk. He’ll have until March 1 to veto the legislation or sign it into law. 

DeSantis has previously said that he didn’t support the bill in its current form, which bars anyone younger than 16 years old from creating new social media accounts — and closes existing accounts for kids 16 and younger. (DeSantis has called social media a “net negative” for young people, but said that, with parental supervision, it could have beneficial effects.) Unlike online safety bills passed in other states, HB1 doesn’t allow minors to use social media with parental permission: if you’re a minor, you can’t have an Instagram.

Even if DeSantis vetoes the bill, the fact that such an aggressive bill passed both the House and Senate with bipartisan support signals that the conversation about online safety legislation is reaching a tipping point. 

The Kids Online Safety Act (KOSA), which implements social media regulations at the federal level, also recently reached a major milestone: an amended version gained enough supporters to pass the Senate. If it moves to a vote, it would be the first child safety bill to get this far in 25 years, since the Children's Online Privacy Protection Act passed in 1998.

If passed, KOSA would make tech platforms responsible (aka have a “duty of care”) for preventing and mitigating harm to minors on topics ranging from mental health disorders and online bullying to eating disorders and sexual exploitation. Users would also be allowed to opt out of addictive design features, such as algorithm-based recommendations, infinite scrolling, and notifications. 

In a previous iteration of KOSA, state attorneys general were able to enforce the duty of care. However, some LGBTQ+ groups were concerned that Republican AGs would use the law to take action against resources for LGBTQ+ youth. The amended version leaves enforcement to the Federal Trade Commission — a move that led a number of advocacy groups, including GLAAD, the Human Rights Campaign, and The Trevor Project, to state they wouldn’t oppose the new version of KOSA if it moves forward. (So, not an endorsement, but not-not an endorsement.)

What’s next? As of this publication, DeSantis has not signed or vetoed Florida’s social media ban. Plus, KOSA has yet to be introduced to the Senate for a vote, and it’s flying solo — there is no companion bill in the House, which would give the House and Senate time to consider a measure simultaneously. 

However, the fallout from January’s Senate Judiciary Committee — in which lawmakers grilled tech CEOs about their alleged failure to stamp out child abuse material on their platforms — may build momentum for future online safety legislation. We’ll keep our eyes peeled.

Practical Parenting Tips

How to use Spotify parental controls

Spotify offers everything from podcasts to audiobooks — and with all of that media comes content concerns. The good news: both Spotify Kids and Spotify parental controls allow kids to enjoy their tunes while keeping their ears clean.

Is One Piece for kids?

If you remember watching the pirate-themed anime series One Piece, you might be excited about the recently released live-action remake now streaming on Netflix and eager to share your love of the show with your kids. But is One Piece for kids?

What’s Catching Our Eye

🔒 Did you know that 90% of caregivers use at least one parental control? That’s according to a new survey from Microsoft.

📱 Social media is associated with a negative impact on youth mental health — but a lot of the research we have tends to focus on adults. In order to really understand cause and effect, researchers need to talk to teens about how they use their phones and social networks. Read more via Science News.

🛑 Meta announced the expansion of the Take It Down program, which is “designed to help teens take back control of their intimate images and help prevent people — whether it’s scammers, ex-partners, or anyone else — from spreading them online.”

Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


At first glance, the idea of setting Spotify parental controls might seem surprising. After all, isn’t Spotify just a music streaming platform? In reality, Spotify offers everything from podcasts to audiobooks — and with all of that media comes content concerns. 

Maybe you’ve heard reports of pornography showing up on Spotify, or maybe you would rather your kids not repeat the f-bomb they picked up while belting along to the latest Olivia Rodrigo track. Whatever your motivation for being leery of giving your child free rein on Spotify, you don’t need to give up on the platform altogether. Both Spotify Kids and Spotify parental controls allow kids to enjoy their tunes while keeping their ears clean.

What is Spotify Kids? 

Spotify Kids is an ad-free service available exclusively with a Spotify Premium Family subscription. Designed for kids aged 12 and under, it features kid-friendly content specifically curated for the youngest listeners. Spotify Kids features music, audiobooks, and more, and allows parents to view and manage the content their child listens to. 

Is Spotify Kids safe? 

Not only does Spotify Kids not contain any content marked Explicit, but it’s also curated by humans, so you don’t have to worry about something sneaking past an algorithm or a filter. 

What age is Spotify Kids for? 

Spotify Kids is specifically designed for kids 12 and under. (Spotify’s terms require users of regular accounts to be at least 13.)

How much does Spotify Kids cost? 

In order to use Spotify Kids, you must have a Spotify Premium Family subscription. Spotify Premium Family is a discounted plan available for up to six family members. It costs $16.99/month, and you can cancel at any time.

How to set up Spotify Kids

Getting started on Spotify Kids is a breeze. Just follow these easy steps: 

  1. Subscribe to Spotify Premium Family.
  2. Download the Spotify Kids app to your child’s iOS or Android device.
  3. Follow the prompts to set up a PIN for accessing the settings within the app.
  4. Create an avatar for your child. (They’ll have fun helping you pick!)
  5. Select the appropriate age category for your child: under 6 or 5-12.

Spotify parental controls 

If you have kids over 13, don’t worry — there are still parental control options to keep their listening experience appropriate. To use Spotify parental controls, you must have a Spotify Family account.

Here’s how to set up Spotify parental controls: 

On a mobile device

  1. Open the Spotify website on your mobile browser. 
  2. Tap the Menu icon (≡) in the top right corner. 
  3. Tap Log in.
  4. Enter your account details.
  5. Tap Account Overview.
  6. Select Premium Family from the dropdown menu.
  7. In the People on this plan section, tap on the name of the family member whose account you want to manage.
  8. Tap Allow explicit content to toggle it to the off position. 

On a desktop

  1. Open the Spotify website.
  2. Log in to your account. 
  3. Click on the Premium Family tab on the left-hand side. 
  4. In the People on this plan section, select the name of the family member whose account you want to manage.
  5. Click on Allow explicit content to toggle it to the off position. 

Spotify on shared family devices  

If your child uses Spotify on a shared family device, and you don’t want to restrict content for that device, be aware that they may come across material that isn’t appropriate for their age. Each family needs to weigh the pros and cons of restricting content and make the choice that’s right for their household. 

If you allow your child to use Spotify on an unrestricted account, it’s a good idea to have a discussion with them about questionable content they might encounter and monitor their use by keeping an ear out or peeking at the listening history. 

In short 

Both Spotify Kids and Spotify parental controls offer families options to keep their kid’s listening experience age-appropriate. If your child listens to music on other platforms, such as YouTube, make sure you use parental control settings on those websites and apps, too. With BrightCanary, you can monitor YouTube activity directly from your phone. 

While it’s good to let your child develop their own interests (and playlists), a little bit of supervision goes a long way in keeping your child from content they’re not old enough to handle on their own. 


Since the early days of the internet, parents have worried what their children are up to online, and companies have responded with parental controls to help keep kids safe. But the way we use the internet has changed dramatically since its inception. This shift has ushered in the need for new approaches to parental controls. Read on to learn how we got here and to explore the best parental controls and monitoring apps to protect kids online.

Types of parental controls 

There are four basic categories of parental controls, ranging from settings on your child’s devices to third-party software. 

Content filters

These controls filter out inappropriate content, thereby limiting what your child can access. In the early days of the internet, the only way to filter content was to install third-party software, such as Net Nanny. Now, the option to filter content is built right into search engines. 

Usage controls 

Usage controls include things like screen time limits and blocking access to certain types of apps, such as social media or gaming. Apple Screen Time is a prime example: this free service allows you to prevent your child from making purchases on the App Store without your permission, schedule quiet time for notifications, and more.

Computer user management 

User management tools are software that set different levels of access, depending on who’s using the device. If you log in to your family laptop, you’ll have unrestricted access, while your child’s profile will include limitations. Most computers now have this feature built-in. 

Monitoring tools

Monitoring tools do exactly what the name suggests: monitor your child’s activity online. What they monitor varies widely depending on the tool. For example, Apple’s Find My monitors your child’s location, while an app like BrightCanary monitors your child’s social media, text messages, and Google and YouTube activity.

The early days of parental controls 

Back in the Wild, Wild West of the World Wide Web, the options for parental controls were limited to web filters. In 1994, Net Nanny introduced a browser that filtered web and chat room content, blocked images, and masked profanity. 

While it was revolutionary at the time, these were still the days when using the internet meant sitting at a desktop computer — typically on a shared family device — with the unmistakable pings of the dial-up modem announcing anytime someone was online. 

Since then, a lot has changed about how we use technology. Kids can access the internet from the palm of their hand with smartphones, smart watches, and tablets. We’re always connected, always online, and always dealing with the compulsion to check social media feeds. These changes have introduced new needs for keeping kids safe online. 

The changing needs of parents and kids

Between WiFi, mobile devices, and social media, using the internet looks very different than it did in the early days of parental controls. And things like the advent of algorithms and the monetization of user data mean our lives are intertwined with the internet in ways we couldn’t have imagined back in dial-up days.  

So, what do modern parents really need from parental controls? 

  • Products that seamlessly integrate into their digital lives: This has been a challenge because, while the iPhone has become the dominant device among teens, Apple is notoriously guarded when it comes to allowing third-party apps to monitor activity. As a result, very few monitoring solutions make supervision truly easy for parents whose kids use Apple devices. 
  • Products that complement what they’re already doing: Apple now offers robust parental control settings, and most social media platforms have their own suites of controls. This leaves less need for all-in-one apps like Bark and Qustodio, which can feel clunky and redundant when parents can now customize these settings (for free) directly on their phone. Other apps, such as BrightCanary, fill in the gaps by monitoring what other tools don’t, such as social media feeds.
  • The ability to monitor messages: Gone are the days when parents knew who their kids were chatting with because they could overhear them on the phone or sneak a peek as they sent instant messages on the family computer. Nowadays, kids primarily communicate over text messages and direct messages, not only on computers, but on phones, tablets, and smartwatches — often out of sight of parents. This shifting landscape has introduced new avenues for kids to be exposed to harmful content and requires new ways for parents to supervise their children.  

Modern solutions for parenting in the digital age

BrightCanary allows parents to keep tabs on their kid’s online life wherever and whenever, all from their own phone. It offers the most comprehensive coverage for kids on Apple devices and, unlike other apps, it actually lets parents see what their kids are viewing online and read their text message conversations. It’s a modern solution for the needs of modern families. 

In short 

What families need from parental controls has shifted in recent years, but many companies have failed to keep up with these changes. BrightCanary offers modern parental control solutions that work for modern families. 

Be the most informed parent in the room.
Sign up for bimonthly digital parenting updates.
©2024 Tacita, Inc. All Rights Reserved.