How to Reset Your Child’s Social Media Algorithm

By Andrea Nelson
October 19, 2023

As a parent, you want your child to surround themselves with good influences. That’s true not only for who they spend time with in real life, but also for the people and ideas they’re exposed to on social media. 

If you or your child are concerned about the content appearing in their feed, one beneficial step you can take is to help them reset their social media algorithm. Here’s how to reset your child’s algorithm on TikTok, Instagram, and other platforms.

What is a social media algorithm?

Social media algorithms are the complex computations that operate behind the scenes of every social media platform to determine what each user sees. 

Everything on your child’s social media feed is likely the result of something they liked, commented on, or shared. (For a more comprehensive explanation, check out our Parent’s Guide to Social Media Algorithms.)

Social media algorithms have a snowball effect. For example, if your child “likes” a cute dog video, they’ll likely see more of that type of content. However, if they search for topics like violence, adult material, or conspiracy theories, their feed can quickly be overwhelmed with negative content.
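
For the technically curious, here’s a toy illustration of that snowball effect. This is a sketch we made up for this article, not any platform’s real code: it simply assumes a feed that samples topics in proportion to a weight that grows every time your child engages with that topic.

```python
# Toy model of the "snowball effect" -- illustrative only, not any platform's real code.
import random

def pick_feed(weights, size=10):
    """Sample a feed of `size` posts; topics appear in proportion to their weights."""
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics], k=size)

# Start with an even mix of topics.
weights = {"dog videos": 1.0, "sports": 1.0, "conspiracy theories": 1.0}

for day in range(1, 6):
    feed = pick_feed(weights)
    for post in feed:
        if post == "conspiracy theories":  # the user engages with only this topic
            weights[post] += 0.5           # every engagement boosts that topic's weight
    share = feed.count("conspiracy theories") / len(feed)
    print(f"Day {day}: {share:.0%} of today's feed was conspiracy content")
```

Run it a few times and the engaged-with topic usually takes over more of the simulated feed each day, which is the same dynamic that lets a single curious tap start reshaping a real feed.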

Therefore, it’s vital that parents actively examine and reset their child’s algorithm when needed, and also teach them the skills to evaluate it for themselves. 

Research clearly demonstrates the potentially negative impacts of social media on tweens and teens. How it affects your child depends a lot on what’s in their feed. And what’s in their feed has everything to do with algorithms. 

Talking to your child about their algorithm

Helping your child reset their algorithm is a wonderful opportunity to teach them digital literacy. Explain why it’s important to think critically about what they see on social media, and how what they do on the site influences the content they’re shown. 

Here are some steps you can take together to clean up their feed: 

Start with their favorite app

Resetting all of your child’s algorithms in one fell swoop can be daunting. Instead, pick the app they use the most and tackle that first. 

Scroll through with them

If your kiddo follows a lot of accounts, you might need to break this step into multiple sessions. Pause on each account they follow and have them consider these questions:

  • Do this person’s posts usually make me feel unhappy or bad about myself? 
  • Does this account make me feel like I need to change who I am? 
  • Do I compare my life, body, or success with others when I view this account? 

If the answer is “yes” to any of these questions, suggest they unfollow the account. If they’re hesitant — for example, if they’re worried unfollowing might cause friend problems — they can instead “hide” or “mute” the account so they don’t see those posts in their feed. 

Encourage interaction with positive accounts 

On the flip side, encourage your child to interact with accounts that make them feel good about themselves and portray positive messages. Liking, commenting, and sharing content that lifts them up will have a ripple effect on the rest of their feed. 

Dig into the settings 

After you’ve gone through their feed, show your child how to examine their settings. These settings mostly influence sponsored content, but considering the problematic history of advertisers marketing to children on social media, it’s wise to take a look.  

Every social media app has slightly different options for how much control users have over their algorithm. Here's what you should know about resetting the algorithm on popular apps your child might use.

How to reset Instagram algorithm

  • Go to Settings > Ads > Ad topics. You can view a list of all the categories advertisers can use to reach your child. Tap “See less” for ads you don’t want to see. 
  • Go to your child’s profile > tap Following > scroll through the categories to view (and unfollow) the accounts that appear most in your child’s feed.
  • Tap the Explore tab in the bottom navigation bar and encourage your child to search for new content that matches their interests, like cooking, animals, or TV shows.

How to reset TikTok algorithm

  • Go to Settings > Content Preferences > Refresh your For You feed. This is like a factory reset of your child’s TikTok algorithm.
  • Go to Settings > Free up space. Select “Clear” next to Cache. This will remove any saved data that could influence your child’s feed.
  • As your child uses TikTok, point out the “Not Interested” feature. Tap and hold a video to pull up this button. Tapping “Not interested” tells TikTok’s algorithm not to show your child videos they don’t like. 

How to reset YouTube algorithm

  • Go to Library > View All. Scroll back through everything your child has watched. You can manually remove any videos that your child doesn’t want associated with their algorithm — just tap the three dots on the right side, then select Remove from watch history.
  • Go to Settings > History & Privacy. Tap “Clear watch history” for a full reset of your child’s YouTube algorithm.

What to watch for

To get the best buy-in and help your child form positive long-term content consumption habits, it’s best to let them take the lead in deciding what accounts and content they want to see. 

At the same time, kids shouldn't have to navigate the internet on their own. Social platforms can easily suggest content and profiles that your child isn't ready to see. A social media monitoring app, such as BrightCanary, can alert you if your child encounters something concerning.

As you review your child’s feed, watch for warning signs like content that promotes self-harm, disordered eating, violence, or conspiracy theories. 

If you spot any of this content, it’s time for a longer conversation to assess your child’s safety. You may decide it’s appropriate to insist they unfollow a particular account. And if what you see on your child’s feed makes you concerned for their mental health or worried they may harm themselves or others, consider reaching out to a professional.  

In short 

Algorithms are the force that drives everything your child sees on social media and can quickly cause their feed to be overtaken by negative content. Regularly reviewing your child’s feed with them and teaching them skills to control their algorithm will help keep their feed positive and minimize some of the negative impacts of social media. 


Just by existing as a person in 2023, you’ve probably heard of social media algorithms. But what are algorithms? How do social media algorithms work? And why should parents care? 

At BrightCanary, we’re all about giving parents the tools and information they need to take a proactive role in their children’s digital life. So, we’ve created this guide to help you understand what social media algorithms are, how they impact your child, and what you can do about it. 

What is a social media algorithm? 

Social media algorithms are complex sets of rules and calculations used by platforms to prioritize the content that users see in their feeds. Each social network uses different algorithms. The algorithm on TikTok is different from the one on YouTube. 

In short, algorithms dictate what you see when you use social media and in what order. 

Why do social media sites use algorithms?

Back in the Wild Wild West days of social media, you would see all of the posts from everyone you were friends with or followed, presented in chronological order. 

But as more users flocked to social media and the amount of content ballooned, platforms started introducing algorithms to sift through those piles of posts and surface the most relevant, interesting content to keep users engaged. The goal is to get users hooked and keep them coming back for more.  

Algorithms are also hugely beneficial for generating advertising revenue for platforms because they help target sponsored content. 

How do algorithms work? 

Each platform uses its own mix of factors, but here are some examples of what influences social media algorithms:

Friends/who you follow 

Most social media sites heavily prioritize showing users content from people they’re connected with on the platform. 

TikTok is unique because it emphasizes showing users new content based on their interests, which means your For You feed is mostly made up of videos from creators you don’t follow. 

Your activity on the site

With the exception of TikTok, if you interact frequently with a particular user, you’re more likely to see their content in your feed. 

The algorithms on TikTok, Instagram Reels, and Instagram Explore prioritize showing you new content based on the type of posts and videos you engage with. For example, the more cute cat videos you watch, the more cute cat videos you’ll be shown. 

YouTube looks at the creators you interact with, your watch history, and the type of content you view to determine suggested videos. 

The popularity of a post or video 

The more likes, shares, and comments a post gets, the more likely it is to be shown to other users. This momentum is the snowball effect that causes posts to go viral. 
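
To make those factors concrete, here’s a simplified, hypothetical sketch of how a feed-ranking score might combine them. The factor names and weights below are invented for illustration; real platforms use far more complex (and secret) models.

```python
# Hypothetical example of engagement-based ranking -- the factors and weights
# are invented for illustration; real platforms use far more complex models.
from dataclasses import dataclass

@dataclass
class Post:
    author_followed: bool   # do you follow the poster?
    past_interactions: int  # how often you've engaged with this author before
    topic_affinity: float   # 0-1: how closely the topic matches what you engage with
    likes: int
    shares: int
    comments: int

def rank_score(post: Post) -> float:
    """Combine relationship, personal activity, and popularity into one score."""
    relationship = 2.0 if post.author_followed else 0.0
    activity = 0.5 * post.past_interactions + 3.0 * post.topic_affinity
    popularity = 0.01 * post.likes + 0.05 * post.shares + 0.03 * post.comments
    return relationship + activity + popularity

candidates = [
    Post(author_followed=True, past_interactions=12, topic_affinity=0.2,
         likes=40, shares=2, comments=5),                    # a friend's post
    Post(author_followed=False, past_interactions=0, topic_affinity=0.9,
         likes=90_000, shares=800, comments=3_000),          # a viral stranger's post
]

feed = sorted(candidates, key=rank_score, reverse=True)  # highest score is shown first
```

Even in this toy version, the viral post from a stranger outranks the friend’s post once its popularity and topic match are high enough, which is why your child’s taps and watch time, not just their friend list, shape what they see.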

Why should parents care about algorithms? 

There are ways social media algorithms can benefit your child, such as creating a personalized experience and helping them discover new things related to their interests. But the drawbacks are also notable — and potentially concerning. 

Since social media algorithms show users more of what they seem to like, your child's feed might quickly become overwhelmed with negative content. Clicking a post out of curiosity or naivety, such as one promoting a conspiracy theory, can inadvertently expose your child to more such content. What may begin as innocent exploration could gradually influence their beliefs.

Experts frequently cite “thinspo” (short for “thinspiration”), content that promotes unhealthy body ideals and disordered eating habits, as another algorithmic concern.

Even though most platforms ban content encouraging eating disorders, users often bypass filters using creative hashtags and abbreviations. If your child clicks on a thinspo post, they may continue to be served content that promotes eating disorders.

Social media algorithm tips for parents

Although social media algorithms are something to monitor, the good news is that parents can help minimize the negative impacts on their child. 

Here are some tips:

Keep watch

It’s a good idea to monitor what the algorithm is showing your child so you can spot any concerning trends. Regularly sit down with them to look at their feed together. 

You can also use a parental monitoring service to alert you if your child consumes alarming content. BrightCanary is an app that continuously monitors your child’s social media activity and flags any concerning content, such as photos that promote self-harm or violent videos — so you can step in and talk about it.

Stay in the know

Keep up on concerning social media trends, such as popular conspiracy theories and internet challenges, so you can spot warning signs in your child’s feed. 

Communication is key

Talk to your child about who they follow and how those accounts make them feel. Encourage them to think critically about the content they consume and to disengage if something makes them feel bad. 

In short

Algorithms influence what content your child sees when they use social media. Parents need to be aware of the potentially harmful impacts this can have on their child and take an active role in combating the negative effects. 

Stay in the know about the latest digital parenting news and trends by subscribing to our weekly newsletter


If you’re concerned about the safety of TikTok for your child and looking for alternatives, you might come across Triller in your search. But although the app frequently lands on lists of kid-safe TikTok alternatives, it’s not without risk. 

So, is Triller safe for kids? In this article, we’ll take a look at the dangers of Triller, how it compares to TikTok, and why it’s not the safest choice for kids.

Is the Triller app safe for kids?

No, the Triller app is not safe for kids. Unlike TikTok, Triller has no parental controls, no age verification, and allows direct messages from strangers. The app also contains large amounts of inappropriate content, making it unsafe for children and younger teens.

What is the Triller app?

Triller is a short-form video platform similar to TikTok and Instagram Reels. It allows users to create and post videos and view other users’ content. 

Originally conceived as a music video app, Triller offers a broader range of music to choose from and a more overt focus on fame and gaining followers than other platforms. 

How does Triller work? 

Triller lets users film multiple takes of a video and then use the built-in AI editing tools to automatically select and combine the best clips to generate a slick-looking video. 

Like other social media platforms, Triller users can follow other creators and like and comment on videos. And similar to TikTok, Triller suggests videos for users, but unlike TikTok, which suggests videos based on a user’s watch history, Triller’s Discover page is based around promoted campaigns, top videos, and genre categories. 

Does Triller have parental controls? 

Not only does Triller have no parental controls, but it also lacks any form of age verification.

Why is Triller unsafe for kids?

Although Triller has a few (read: very few) safeguards in place, like the ability to set accounts to private, turn off data collection, and block users, the red flags are plentiful. 

  • No age verification. Flimsy age verification is a problem for many social media apps (we’re looking at you, TikTok), but Triller doesn’t even pretend to try. 
  • No parental controls. None, zip, zilch, nada.  
  • DMs with no ability to limit messages to contacts. Social media platforms, especially those without the ability to restrict who can message you, are prime spots where predators target children. Triller doesn’t allow users to limit DMs to people they’re connected with, meaning anyone could message your child.
  • Tons of inappropriate content. Triller is filled with content that’s inappropriate for kids, such as highly suggestive videos, profanity, and content promoting substance use. 
  • Location of videos can be revealed. Users can reveal the location where their content was filmed, opening kids up to serious safety concerns. 

Triller vs. TikTok: Which app is safer for kids?

All things considered, Triller is much less safe than TikTok. Here’s how the apps stack up: 


Feature | TikTok | Triller
Parental controls | Yes | No
Age verification | Yes, easy to bypass | No
Limit direct messages | Yes | No, must block users individually
Content moderation | Yes, but explicit content slips through | Yes, but explicit content slips through
Explicit material | Prohibited but common | Prohibited but common
Community Guidelines | Yes, but not kid-focused | Yes, but not kid-focused

Final word: Is Triller safe for kids? 

Triller is a video-sharing app that has its sights set on competing with TikTok. But Triller is not safe for kids, including younger teens. With inappropriate content, a lack of parental controls and age verification, and no ability to limit who can message you, Triller is only appropriate for users 17 and older. 

BrightCanary helps parents monitor what their children type and search on the apps they use the most, including Triller and TikTok. Download today to get started for free.


Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • The new Instagram Map allows users to share their location with followers. Here’s how to keep your child’s location private.
  • TikTok announced updates to Family Pairing, the platform’s parental control suite.
  • YouTube will use AI to tell if users have lied about their age to bypass age-based content restrictions. But is it enough?

Digital Parenting

📍 Teens can now share location with friends on Instagram: The new Instagram Map allows users to share their location with friends and see content people are posting from different places. The feature is similar to Snapchat’s Snap Map … and has similar privacy concerns: if your child isn’t picky about who follows them on Instagram, they might end up sharing their location with strangers. Big yikes. 

Location sharing is off unless you opt in, and if you’ve set up parental supervision on your teen’s account, you’ll get notified if your teen starts sharing their location and see who they’re sharing their location with. 

The fact remains, though, that this feature opens up more risks than rewards, and we recommend talking to your teen about why it’s better to keep their location private (and only share their location with you), rather than broadcasting their personal info to their entire friends list.

👀 TikTok introduces new parental controls: TikTok’s parental control suite is getting an upgrade just in time for the school year. TikTok Family Pairing allows parents to set content and communication limits for their teen. Soon, Family Pairing will also automatically notify a parent when their teen uploads anything that’s visible to others on TikTok, and it will show parents what privacy settings and topics their teen has selected to shape their feed. 

If your child uses TikTok and you haven’t already set up Family Pairing, now is the time — but pair it with regular check-ins and conversations about what they see on their For You page. (Psst: For extra protection, BrightCanary monitors what your child types on all the apps they use, including TikTok comments and messages.)

▶️ YouTube will use AI to tell if your child fibbed about their age: YouTube is cracking down on age verification. YouTube offers different experiences for kids and adults, including protections for younger users like disabled personalized advertising and limits on repetitive viewing of some content — but it’s relatively easy for kids to fib about their age. Soon, YouTube will use machine learning to determine if a user is an adult or not.

The AI will analyze a bunch of signals, including the types of videos a user searches for and how old the account is. Users who are incorrectly flagged as kids (or kids at heart) will have the option to verify that they’re over 18 using a government ID or another form of identification.

So. If everything works as it should, kids won’t be able to watch age-restricted content. Of course, that’s a big “if.” Continue to supervise your child’s YouTube use. Talk about the content they see and what to do if something makes them feel uncomfortable. You can use BrightCanary to monitor their YouTube account and keep track of what they watch, including the ads they see. 

📱 Webinar: How to make social media as safe as it CAN be: Lots of social media updates this week, but that’s because social media platforms constantly update their features and protections (or lack thereof). That’s why we’re happy to share “Safe Social Media?!?” by Digital Mom Media and Healthy Screen Habits. 

In this webinar, you’ll learn everything you need to make social media as safe as possible — how to get started, what to know about popular platforms, and how to build a sustainable social media plan. Learn more and get your tickets today.


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


Tech Talks

We’re officially in back-to-school season. Cue the new backpacks, earlier bedtimes, and the return of juggling group chats and the social dynamics of the school hallways. The good news: A few honest conversations now can help your child start the year with healthy habits and a little more confidence. Here are conversation-starters to help you talk about managing stress, balancing screen time, and handling social pressures:

  1. “What’s your favorite way to take a break when school feels overwhelming?”
  2. “If your friends are chatting online late at night, how will you handle it so you still get enough sleep?”
  3. “Do you think social media changes how people act at school? How so?”
  4. “Are there certain apps or chats you think might be distracting during the school week? Should we set limits together?”
  5. “What’s one healthy habit you want to start this school year — and how can I support you in it?”

What’s Catching Our Eye

😬 Heads up if your teen is fueled by energy drinks: High Noon is recalling some vodka seltzers mislabeled as Celsius energy drinks.

⚠️ Minnesota is set to require warning labels on social media sites. Agree or disagree?

🚫 “ … YouTube cannot be trusted — the algorithm leads you very quickly down weird or concerning rabbit holes and the messaging in a video is often subtle and hard to screen as a parent without pre-watching everything (which is not realistic).” The Guardian surveyed parents about how their kids use YouTube, and the responses paint a picture of YouTube’s major pros and cons.


A new school year doesn’t just mean backpacks and bus rides. It often comes with a new phone or tablet, new social circles, and more opportunities for unsupervised time online. That’s a perfect storm for kids to encounter digital risks, many of which parents never see coming.

At BrightCanary, we use real-time monitoring and AI to help parents understand what their kids are actually doing online. Based on what we’ve seen kids searching for and typing into their devices, here are five digital dangers to watch for this school year, broken down by age.

1. Mental health struggles

What we’ve seen: We’ve flagged messages and searches from kids expressing feelings of sadness, isolation, anxiety, and hopelessness, often late at night.

What it is: Algorithms on platforms like Instagram, TikTok, and YouTube tend to serve up more of what a child interacts with. That means one post about depression can quickly spiral into a feed full of triggering or glamorized content about self-harm, eating disorders, or suicidal ideation. As many as 41% of girls see suicide-related content on Instagram every month.

What’s worse: Kids often hide this behavior. We’ve seen searches phrased in vague or coded terms, like “unalive” or “d13,” used to talk about suicidal thoughts.

How it can show up by age: 

  • Elementary: Googling questions about sadness
  • Middle school: Messaging friends about anxiety or body image
  • High school: Watching content that normalizes depression or searching for ways to self-harm

2. Grooming and exploitation

What we’ve seen: We’ve flagged instances of teens having explicit conversations with adults online.

What it is: Online grooming happens when a predator builds trust with a child in order to exploit them sexually. These interactions often start in social apps or games with chat features, like Roblox, Reddit, or Minecraft.

Predators are skilled at targeting vulnerabilities. They may compliment your child or offer support to get them on their side. If your child has a public profile or uses apps without strict age restrictions, they’re more exposed.

How it can show up by age:

  • Elementary: Chat features in YouTube, gaming apps like Roblox
  • Middle school: Discord, Reddit, and public DMs on social media
  • High school: Flirting with strangers on apps like Snapchat or Instagram

3. Drugs

What we’ve seen: We’ve flagged conversations where teens mention drug slang, ask where to buy alcohol, or joke about using drugs. There have also been reports about drug dealers using social media platforms to arrange sales.

What it is: From vaping in the bathroom to buying drugs on Snapchat, substance use is rampant in schools. Kids can be exposed to drug-related content earlier than many parents realize, whether it’s memes glamorizing weed, peers using slang like “gas” or “snow,” or influencers normalizing casual drinking. 

Some dealers even use secret messaging apps to advertise and arrange drug deals, making it harder for parents to spot the warning signs.

How it can show up by age:

  • Elementary: Asking what a slang term means (like “edibles”) after hearing it online
  • Middle school: Searching drug terms or watching content that glamorizes substances
  • High school: Messaging about trying or buying substances

4. Cyberbullying

What we’ve seen: We’ve caught messages where kids were told to “KYS” (“kill yourself”), were targeted in toxic group chats, or even initiated bullying themselves via direct messages. 

What it is: Cyberbullying takes many forms, like name-calling, exclusion, harassment, doxxing, or spreading memes at someone’s expense. As many as 90% of teens have been bullied online, so it’s more common than you might think. And because it happens in group chats, DMs, and disappearing messages, it often flies under the radar.

How it can show up by age:

  • Elementary: Multiplayer games with chat features
  • Middle school: Bullying in DMs or group chats
  • High school: Harassment via Snapchat or anonymous accounts on social media

5. Over-sharing of personal information

What we’ve seen: Kids sharing their full names, locations, and school names in chats, often without realizing how dangerous that can be.

What it is: Many kids don’t think twice about sharing personal details online. But doing so can expose them to predators, online scams, and doxxing. Even casual details in an Instagram caption can help someone track them down in real life or use their personal details against them.

How it can show up by age:

  • Elementary: Typing their school name or location in a comment
  • Middle school: Sharing their location or Snap Map access
  • High school: Posting schedules, addresses, or travel plans

What parents can do

1. Stay informed

It’s hard to protect your kids from what you don’t understand. Stay up to date on emerging apps, trends, and online behaviors (that’s what we’re here for).

2. Keep the conversation going

Talk regularly with your child about what they’re seeing online. Use open-ended questions, and avoid jumping to conclusions. The goal is connection, not control. 

3. Set tech boundaries

Make clear rules around screen time, privacy settings, and what’s okay to share. Revisit these rules as your child gets older and their digital life evolves.

4. Use monitoring tools

BrightCanary gives you real-time summaries of what your child types online. Our keyboard-based monitoring works across every app they use on their iPhone or iPad, even if they use incognito mode or secret accounts.

5. Step in when needed

If something feels off, trust your gut. You can reach out to teachers or school administrators, report behavior on social platforms, and contact authorities if there’s a serious risk.

The bottom line

Back-to-school season isn’t just about pencils and planners; it’s a key moment to check in on your child’s digital life. The online world they navigate is complex and ever-changing. But with awareness, conversation, and the right tools, you can help them stay safer this school year.

BrightCanary makes it easier. Our AI-powered monitoring alerts you to red flags and gives you a clearer picture of your child’s digital world so you can parent with confidence.

Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:

  • Introducing the BrightCanary Keyboard: a new and smarter way to keep your child safe across all the apps they use.
  • How often does your teen pick up their phone in a day, and how might that impact their mood?
  • Instagram direct messages now have enhanced safety features for minors, but only if your child signed up with a Teen Account.

Digital Parenting

📱 The keyboard that helps you see what your child types across all apps: Kids don’t just chat on one app. They’re messaging on Instagram, searching on Google, and talking to friends on Discord simultaneously and while eating all the food in your fridge. With the new BrightCanary Keyboard, you can monitor typed activity across any app — from Discord to Roblox chats. With AI-powered insights and real-time concerning content alerts, you can stay connected and informed, without reading every word. Plus, you can still monitor text messages and Google + YouTube searches with the BrightCanary app. This feature is available now, and if you currently use BrightCanary, you’ll see an update in the coming weeks. Learn how it works and how to set it up.

😩 Does the number of times teens pick up their phones influence their mood? If you have an iPhone, you can check how often you pick up your smartphone during the day: Go to Settings > Screen Time > See all app and website activity > Pickups. You can even see which apps you checked most often after picking up your phone. That’s interesting (and potentially harrowing, depending on how often you pick up your phone), but how does this impact teens? Researchers studied a group of teens over two weeks and found that teens picked up their phones an average of 112.6 times per day. Nearly half (49%) of pickups were for Snapchat, followed by Instagram (13%) and messages (12%). They also found:

  • Teens who felt more negative emotions tended to check their phones more frequently.
  • When a teen had a bad day, they picked up their phone more often the next day, but only if they had lower mindfulness skills. 

Over at Techno Sapiens, Jacqueline Nesi, PhD, one of the authors of the study, notes that the phone pickups themselves didn’t seem to drive the teens’ moods. Rather, kids who look at their phones more often might see things that make them upset, or kids who already have variable emotions might use their phones to feel better, among other factors.

If your teen is using their phone to help cope with their emotions, it’s important to pay attention to what type of content they’re consuming. Are they talking to a friend or family member, or are they doomscrolling on Instagram? Staying involved can help you teach your child better ways to regulate their emotions, especially if their social media algorithm is making them feel worse. (If that’s the case, here’s how to reset their algorithm together.) 

🔒 Meta enhances direct messaging protections for teen users: New safety features in Instagram DMs will allow teens to see more information about who they’re chatting with, like when the account was created and important safety tips to spot potential scammers. Teens can also block and report accounts directly from DMs. It’s an unfortunate truth that teens can be scammed by phishing attempts, catfishing, and more, directly from their inbox. Instagram Teen Accounts automatically limit who can message them, so they can only be messaged by people they follow. The trick is making sure that your teen signed up for Instagram with their correct birthdate — the enhanced safety features only work if Instagram recognizes that your teen is, well, a teen. If your kiddo uses Instagram, double-check their privacy settings and talk to them about why it’s important that they use the strictest privacy settings possible and only talk to people they know in real life. 

Did you know? With the new BrightCanary Keyboard, you can monitor what your child types in their Instagram DMs and get real-time alerts when BrightCanary identifies red flags, like self-harm or drug references.


Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.


Tech Talks

Your child is typing more than they’re talking. How can you help them stay safe in every corner of the internet, even the ones you don’t check daily? The BrightCanary Keyboard gives parents a new window into the digital world their kids are navigating. Here’s how to talk about it together.

  1. “Did you know your phone’s keyboard can help keep you safe online? Want to learn how it works together?”
  2. “A lot of the apps you use have safety settings, but they aren’t always designed with kids in mind. Here’s how BrightCanary helps us.”
  3. “Have you ever talked to someone online who made you feel uncomfortable?”
  4. “It’s a good idea to set time limits around when you use your phone. I see you’re using your phone overnight, but sleep is really important. Why don’t we try leaving your phone to charge in the kitchen overnight, instead of your room?”
  5. “Having an iPhone or iPad is a really big responsibility, and strangers online aren’t always going to have your best interests at heart. BrightCanary is one of our rules for having your own device.”

What's Catching Our Eye

🫠 Social media slang is reshaping the way we speak. Does all the new slang today make you feel so delulu? The way kids communicate online changes fast, fueled by new trends online and in the media they consume. CBS reports on how platforms like TikTok are transforming modern language. (Heads up: With the BrightCanary Keyboard, you’ll be able to see what your child types, even when they’re using slang or coded language.)

😬 Elon Musk announced that his AI startup xAI will make an app dedicated to kid-friendly content and call it Baby Grok, which, y’know, isn’t ideal, given the company’s history with content moderation and lack of safeguards for younger users. xAI’s chatbot Grok has a history of sharing problematic content online, to say the least.

🍬 Are rewards for children doing more harm than good? Research suggests that rewards for things kids wouldn’t typically do on their own (like cleaning their room) are fine because they can help “jumpstart” motivation — kinda like how you might give yourself a little treat after putting away laundry or finally taking the car to get serviced. Read more at the very excellent Parenting Translator.


TikTok is mega popular with kids, but it’s also super risky. We looked at five apps to see if they’re safer alternatives to TikTok for kids. We based our verdicts on how actively each app moderates content and keeps concerning material away from minors, how robust the parental controls are, and whether kids can easily get around age verification measures.

TikTok Alternatives for Kids | Our Verdict
Coverstar | 👍 Give it a try!
Zigazoo | 👍 Give it a try!
Marco Polo | 👍 Give it a try!
TikTok Under 13 Experience | 👎 Steer clear!
Triller | 👎 Steer clear!

Coverstar

Coverstar is a short-form video social media app that promises a safe, supportive alternative to TikTok for tweens and teens.

Coverstar pros: 

  • Takes active steps to prevent bullying and explicit material. 
  • Moderated content (human and AI). 
  • No DMs.
  • Strict community guidelines. 
  • Easy reporting tools. 
  • Privacy controls. 
  • Parent permission required. 

Coverstar cons: 

  • No age verification. 
  • Ads targeted at kids.
  • No parental controls. 

Coverstar verdict: 

👍 Give it a try! 

Concerns about a lack of parental controls are eased by the fact that the app is inherently quite safe. And although there’s no age verification, the parental consent requirement is robust.

Zigazoo

Zigazoo is a TikTok-style social media platform centered around age-appropriate video challenges. 

Zigazoo pros: 

  • Login is linked to parents, and parent consent is required to set up an account (and confirmed through photo recognition).
  • Settings are locked by parents at the time an account is created. Parents can also log in directly to view and manage accounts.
  • Strong age verification through photo recognition.
  • Strict community guidelines and human content moderation. 
  • All videos must be approved by moderators, and videos are age-verified.
  • No direct messaging.
  • No text comments allowed on videos.
  • Sticker and gift-giving is moderated for positivity and appropriateness.

Zigazoo cons: 

  • No global privacy settings.
  • Parental permission is easy to bypass for in-app purchases.

Zigazoo verdict: 

👍 Give it a try! 

We love Zigazoo’s focus on positivity and thoughtful approach to user safety. Its strong age verification process, required parental consent, and human moderation help keep your child safe on the app.

Did you know? BrightCanary monitors your child’s activity across all the apps they use, including the apps on this list. You’ll get insights into what they type, their emotional well-being, summaries of their activity, and more.

TikTok Under 13 Experience 

Kids under 13 who sign up for TikTok will be automatically directed to the Under 13 Experience, which is designed and curated with younger users in mind. 

TikTok Under 13 Experience pros: 

  • All accounts are private.
  • Users can create but not post videos.
  • App use is limited to one hour per day.
  • TikTok partners with Common Sense to curate the For You and Discover feeds with age-appropriate videos. 
  • Users can’t comment on or share videos.
  • No DMs.
  • Parental controls.

TikTok Under 13 Experience cons: 

  • Flimsy age-verification process. 
  • Easy to create a second account and bypass age restrictions. 
  • Risk of addiction is well documented, including by the company itself.

TikTok Under 13 Experience verdict: 

👎 Steer clear!

We like the Common Sense-curated feed, private accounts, and the fact that users can’t post videos. But these pros are overshadowed by some pretty big cons. Chief among them is the fact that, once parents have okayed downloading the Under 13 Experience, kids can easily create a second account and bypass TikTok’s flimsy age verification. Plus, we really don’t trust TikTok’s algorithm.

Triller

Triller is a video-based social media platform for people over 13 that lets users browse, create, and post videos. 

Triller pros: 

  • Accounts can be set to private. 
  • Can turn off data collection.
  • Ability to block users.

Triller cons: 

  • No age verification. 
  • No parental controls.
  • DMs with no ability to limit messages to contacts. 
  • Users can directly pay each other. 
  • Tons of inappropriate content. 
  • Location of videos can be revealed. 

Triller verdict: 

👎 Steer clear!

We’re not sure how Triller keeps showing up on lists of safe TikTok alternatives for kids. It has red flags all over it. This app isn’t even suitable for young teens. 

Marco Polo

Marco Polo is a video chat social media app with a turn-taking format and the ability to send messages to groups of up to 200 people. Users must be at least 13 to use it.

Marco Polo pros: 

  • No public feed. 
  • No ability to connect with strangers.
  • Can turn off the feature that notifies contacts a user is online. 
  • Users can restrict others from downloading or forwarding videos they send. 

Marco Polo cons: 

  • No age verification. 
  • No parental controls. 

Marco Polo verdict: 

👍 Give it a try!

While it doesn’t mirror the TikTok experience to the same degree some of the other apps do, Marco Polo is a good option with strong privacy settings. If your teen wants to share videos, but you only want them to be able to chat with people they know, give Marco Polo a try.   

Wrap-up: What TikTok alternatives are safe for kids?  

We reviewed five TikTok alternatives to see if they’re safe for kids. Coverstar, Zigazoo, and Marco Polo all get the thumbs up from us, while Triller and TikTok Under 13 Experience get a thumbs down. 

To keep an eye on what your child is doing online, try BrightCanary. The app uses advanced technology to help you monitor your child’s activity on the apps they use the most. Download today to get started for free.

If your kids love the cute animal videos and lip sync battles on TikTok, but you don’t love the inappropriate content and exposure to strangers, Coverstar presents an interesting alternative. 

The app bills itself as “The Safe TikTok Alternative” and promises a kid-friendly experience. But is Coverstar safe for kids or not? We investigated how it works, what parental controls are offered, and how it stacks up against TikTok. 

What is Coverstar?

Coverstar is a short-form video social media app designed for tweens and teens. It promises a safe and supportive environment for kids in the following ways:

  • Preventing bullying and explicit material
  • AI and human content moderation 
  • No direct messaging 
  • Strict community guidelines 
  • Reporting tools 
  • Privacy controls 
  • Parental consent required for users under 13 

How does Coverstar work? 

Similar to TikTok, Coverstar lets users create, share, and watch short videos with the option of adding music, sound effects, voice-overs, and visual effects like filters and masks. Users can create and participate in challenges, and the app is often swept by viral trends. 

Does Coverstar have parental controls? 

Despite requiring parental consent for users under the age of 13, there aren’t any parental controls. 

The app's design reduces some of the need for parental controls, but parents are unable to do things like set time limits. It also means kids can change their privacy settings without parental permission. 

Should I be concerned about my child using Coverstar?

Although Coverstar is generally safe for kids, all social media comes with risks. Here’s what to watch out for: 

  • Self-esteem issues. Social comparison and chasing likes are hard to avoid when using apps like Coverstar. This can lead to self-esteem issues in kids. 
  • Marketing. Coverstar allows games and other apps to market to kids on the platform.
  • Cyberbullying. Coverstar takes bullying seriously. However, kids can be really sneaky and may find ways to get around Coverstar’s human and AI moderators.
  • Addiction. Coverstar’s algorithm isn’t as infamously addictive as TikTok’s, but all social media is designed to be addictive, regardless of the platform. 
  • Predation and catfishing. With no age verification, adults can easily pose as kids. A lack of private messages mitigates much of the concern, but adults could still engage with your child in the comments section and lure them into another app to message privately. 

Is Coverstar safer than TikTok? 

All things considered, Coverstar is a much safer option than TikTok. Here’s how the two apps stack up:

Feature | Coverstar | TikTok
Parental controls | No, but the app’s built-in protections reduce most reasons for parental controls | Yes
Direct messaging | No | Yes
Comments on videos | Yes, but moderated | Yes
Content moderation | Robust | Yes, but plenty of explicit and unsafe material sneaks through
Community guidelines | Strict and geared toward keeping kids safe | Yes, but not geared specifically toward kids
Explicit material | Prohibited and strictly monitored | Technically prohibited, but frequently found on the app
Bullying | Prohibited and strictly monitored, but can sometimes still occur in comments | Prohibited and monitored, but far less strict than Coverstar
Predation | Prohibited and monitored, but adults can still pose as kids | Comments and direct messages are risky for kids
Reporting tools | Yes | Yes
Search feature | No search feature, limiting accidental exposure to inappropriate material | Yes
Age verification | Parental permission required for users under 13, but no age verification when signing up | Yes, but easy to bypass

How can I help my child use Coverstar safely? 

Here are some ways to help make Coverstar’s experience even safer for your child: 

  • Check their settings. Make sure their account is set to private, and periodically spot-check to ensure they haven’t changed it. 
  • Talk to them about online privacy. Explain the risks of oversharing and the importance of not putting any private information online. 
  • Discuss stranger danger. Educate them on the dangers of interacting with strangers online and teach them to spot the signs of grooming.
  • Monitor their use. Don’t rely solely on Coverstar’s moderators to keep your child safe. Periodically sit down to look at their account together, and use a monitoring app like BrightCanary to keep an eye on what they’re typing in the app. 

Final word

Coverstar positions itself as “The Safe TikTok Alternative,” and the app largely lives up to that promise. Strict community guidelines, robust moderation, and no direct messages all add to the safety of the app. 

But no social media is without risk. Parents should discuss the dangers of social media with their child and stay involved in monitoring their use.

BrightCanary helps parents monitor what their children type and search on the apps they use the most, including Coverstar and TikTok. Download today to get started for free.


Fellow Millennial parents might assume Tumblr has gone by the wayside with other early-2000s social media sites like MySpace and LiveJournal. You might be surprised to learn the microblogging platform is enjoying a major resurgence, fueled by Gen Z. But is Tumblr safe for kids? 

This guide discusses why kids like Tumblr, its risks, and what parents can do to help keep their child safe on the app. 

What is Tumblr?

Launched in 2007, Tumblr is a cross between a social media platform and a microblogging site. Users can create blogs and share them with friends and followers either on the Tumblr app or on other social media platforms. 

Tumblr blogs span from fanfiction to art to memes, and everything in between.  

Why is Tumblr so popular with kids? 

The younger generation is flocking to Tumblr in record numbers. A whopping 50% of users are Gen Z. Here are some of the many reasons Tumblr is so popular with kids: 

  • Fandom. Kids use Tumblr to follow writers, artists, and other creatives they’re drawn to. 
  • Creative expression. Many kids turn to Tumblr as a way to showcase their own creativity. 
  • Community. Tumblr recently introduced Communities, group spaces where users can connect with others who share similar interests. 
  • Exploring interests. Because Tumblr is organized around interest-based communities, like writing or drawing, it’s a way for kids to deep dive into their passions. 
  • Real-time feedback. Tumblr’s feed feature gives users real-time feedback on what they post in the form of likes, replies, and messages.
  • Acceptance. Tumblr’s active LGBT+ community gives young users a space to explore their identity and learn from others. 
  • Their parents aren’t there. Tumblr has been called “The one social media millennials didn’t ruin.” (Ouch.) 

What are Tumblr’s age limits? 

As on other social media platforms, users in the US must be at least 13. However, age verification relies on self-reported birthdates, so it’s very easy to subvert.

Does Tumblr have parental controls? 

Tumblr has zero parental controls, so it gets a big ol’ F on this metric. 

The App Store rates Tumblr 17+, and Common Sense Media advises it shouldn’t be used by kids under 15. 

Is Tumblr safe for kids? 

Tumblr poses significant risks for kids, including: 

  • Inappropriate content. Pornography, racy content, adult language, and depictions of drug use can all be easily found on Tumblr. There is a Safe Mode, but with no parental controls, it can’t be locked, so kids can turn it off. 
  • No private account option. All Tumblr accounts are public, meaning anyone can follow your child. 
  • Exposure to strangers. Public accounts plus a direct messaging feature open kids up to catfishing, scams, and predatory behavior.
  • Dangerous rabbit holes. If your child clicks on something problematic (even by accident), the algorithm will start suggesting similar content. This can lead kids into the darker corners of Tumblr filled with hate speech, misogyny, and violent rhetoric. 

How to keep your child safe on Tumblr

Here are some actions you can take to make your child’s experience on Tumblr safer:

  • Educate them on the risks. Talk to your child about what to look out for on Tumblr, including how to spot grooming and the dangers of addictive algorithms.
  • Help them adjust their settings. Show your child how to use Tumblr’s privacy and safety settings. We recommend limiting their account’s discoverability, blocking users they don’t follow from messaging them, setting up filters, hiding mature content, and hiding problematic topics like addiction, violence, and sexual themes.
  • Teach them what to do if they encounter a problem. Show them how to report content and encourage them to also let you know if they encounter any problems on the platform.  
  • Stay involved in their usage. One of the most effective steps you can take to keep your child safe on any platform is to stay involved. You can do this by following them (dust off your 2010-era account and hit that Follow button), practicing safety check-ins, and discussing their experience together. 
  • Use a monitoring app. BrightCanary monitors what your child types on all the platforms they use on their iPhone or iPad, including Tumblr. If they search for or message anything concerning, you’ll be able to see it in the app. 

Final word

Tumblr offers a creative space for users to gather around shared interests. However, the lack of parental controls, public accounts, and exposure to problematic content make the platform unsafe for younger teens. Kids under 15 shouldn’t be allowed to use Tumblr, and parents should take an active role in protecting their child on the app. 

For parents who take online safety seriously, BrightCanary offers the most comprehensive monitoring on Apple devices. Monitor what your child sends on all the apps they use, including Tumblr, Discord, and even text messages. Download today and get started for free. 


Is your teen begging to start an Instagram or Snapchat account? Introducing kids to social media is a big deal because it can expose them to the broader digital world — and all the risks associated with it. 

In this article, we’ll discuss how to introduce kids to social media and tips for helping them stay safe.

What’s the right age for introducing kids to social media? 

There are two primary factors to consider when deciding if your child is ready for social media: age and maturity. 

Aside from a handful of apps designed for younger kids, such as Kinzoo and Messenger Kids, most social media platforms require users to be at least 13 years old. However, just because your child is technically old enough doesn’t mean they’re automatically ready for Snapchat (or TikTok, Instagram, or any of the other platforms). 

If your 15-year-old isn’t mature enough for social media, you shouldn’t feel pressure to let them use it. But don’t keep them in the dark just because they’re not ready yet — it’s a good idea to start educating your child on how to safely use social media before you hand them the reins.

How do you introduce kids to social media? 

Once you've decided it’s time to let your teen use social media, here are some tips to get them going: 

Start small 

  • Pick one platform to start with. Consider what platform their friends are on, the age-appropriateness of that platform, and what you’re most comfortable with. 
  • Add on slowly. Only allow your child to join a second social media platform once they’ve proven they can responsibly handle the first one. 

Educate them on the risks 

Explaining the risks of social media shows your teen why it’s important to behave responsibly online. It also helps them learn to spot danger — an important ingredient for lowering their risk. 

We’ve covered many of these dangers, including cyberbullying, grooming and exploitation, inappropriate content, and oversharing personal information.

Teach them how to stay safe 

We often think of teens as inevitably drawn to risk, but studies actually reveal that teens are often more cautious than their younger peers, choosing the safer option when given the information needed to make that choice. 

To equip your teen with the ability to make safe choices on social media, teach them about privacy settings, how algorithms shape what they see, and what is (and isn’t) safe to share online. 

What can I do to keep my teen safe on social media? 

Think of these tips as starting points. You’ll want to continuously check in with your child once they start using social media on their own.

Set limits

As your child matures, you can give them increasing leeway in when and how often they use social media (within reason). But when they’re first starting out, it’s a good idea to create more stringent boundaries to help them learn appropriate limits. 

Lock it down 

Utilize the parental controls on the social media apps your child uses, as well as any built into their device. 

Stay involved 

The American Psychological Association recommends that parents monitor social media for all kids under 15, and depending on your child’s maturity level, it may be necessary to do so for longer. Here are some ways to stay involved: 

  • Follow them. Not on the social media platforms your child uses? Time to open an account! Following them on social media won’t tell you everything you need to know, but it’s a good place to start. 
  • Practice digital safety check-ins. Establish designated times when you sit with your child to look at their device together and discuss their online activity.
  • Use a monitoring app. BrightCanary uses AI technology to remotely monitor your child’s social media (as well as other online activity) and alerts you if there’s an issue so you can follow up.
  • Build independence over time. The ultimate goal is to raise a young adult who knows how to use social media wisely. As your child matures and proves they can act responsibly online, increase their autonomy. Don’t totally check out, though. 

Did you know? BrightCanary is a great way to give your child independence without compromising on safety because you get alerts when there’s a red flag … without having to look at everything your child does online. 

In short

By being proactive, parents can introduce social media to their child in a way that encourages them to be responsible and stay safe. Parents should educate their child on the risks of social media, teach them tips for staying safe, and remain involved in their child’s online activity. 

BrightCanary gives you real-time insights to keep your child safe online. The app uses advanced technology to monitor them on the apps and websites they use most often. Download on the App Store today and get started for free.
