Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
What if the next time your child signs up for a social media platform, they’re faced with a health warning — the same kind of label you see on cigarettes?
Surgeon General Dr. Vivek Murthy recently called for a warning label that states social media is associated with significant mental health harms for adolescents. The statement comes after Murthy issued a health advisory in May 2023, warning that social media is contributing to the youth mental health crisis.
What is a warning label? You’ve likely seen these labels on tobacco and alcohol products. A surgeon general’s warning label is a public statement that calls attention to a critical public health issue.
Warning labels can’t be implemented without congressional approval, but Murthy’s statement furthers a growing movement for regulation on social media to help keep kids safe and minimize the dangers of addictive design features. For example, New York recently passed a measure that bans social media platforms from algorithmically recommending content to children.
It’s not over: Murthy acknowledges that a warning label, on its own, wouldn’t make social media safer for young people. He also urges legislators to take further steps to make these platforms safer for kids.
“There is no seatbelt for parents to click, no helmet to snap in place, no assurance that trusted experts have investigated and ensured that these platforms are safe for our kids,” he wrote. “There are just parents and their children, trying to figure it out on their own, pitted against some of the best product engineers and most well-resourced companies in the world.”
Parents can help, too — by creating more phone-free experiences at home and at school, supervising kids’ social media use, and delaying giving kids access to phones until after middle school. Stay involved, ask questions, and understand what your child is doing on their devices.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
For today's kids, digital literacy comes into play with everything from school projects to social media. When kids are skilled in digital literacy, they’re more capable of identifying reputable information and sources. Here’s how to raise digitally literate kids.
Smishing — phishing’s younger sibling — is an increasingly common form of cyberattack and one parents need to know about so they can help their kids stay safe. But what is smishing? Read on to learn what this scam entails and how to prevent it from happening to your child.
You know you should talk to your child about what they’re doing on their phone, but it can feel awkward and intrusive. Here are some ways to start the conversation:
🎮 Is your child developing an unhealthy relationship with video games? Melanie Hempe of ScreenStrong shares a video game addiction test you can use today.
🔨 Apple recently announced a fix to a problematic Screen Time bug that allowed kids to view explicit content. (If your child is getting around Apple Screen Time, here are some troubleshooting tips.)
🎉 BrightCanary is now free for school teachers, counselors, and mental health professionals! Learn more in this letter from our CEO Karl Stillner.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
New research conducted by NORC at the University of Chicago investigates how social media affects teen mental health, and the results are a mixed bag. The study included 1,274 teens and young adults aged 14 to 22.
Key findings:
What this means for parents: Lead researcher Amanda Lenhart suggests keeping communication open and encouraging teens to be aware of their emotions while using social media. Ask questions like, “How am I feeling right now? Did I see anything that made me feel sad?”
Parental involvement is crucial. Many young adults wish their parents had delayed their social media use. By staying engaged and setting clear rules, you can help your teens safely navigate the digital world.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
A recent study published in Frontiers in Child and Adolescent Psychiatry reveals that any form of parental distraction, whether digital or not, negatively affects parent-child interaction.
You may have heard about technoference (when devices interrupt family time). This study asked, “Are the screens the problem, or is it the fact that the parent is distracted?”
In the study, 50 parent-child pairs (with kids around 22 months old) were split into three groups. Group one had no disruptions. Group two parents were asked to stop play time to fill out a paper questionnaire, while group three parents used a tablet. The kids didn’t care whether the distraction was digital or analog. All distractions equally upset them, hurting the quality of interaction.
Big picture: Principal investigator Nevena Dimitrova said the screens themselves aren’t the problem. “Instead, it seems to be the fact that the parent is not fully engaged in the interaction that negatively impacts parent-child communication.”
Said another way, it’s not your iPhone’s fault — it’s the distraction. Want to boost your bonding time? Try minimizing distractions (easier said than done, we know). Put away your phone or set aside non-digital tasks when you’re having one-on-one time with your family. Giving your full attention can do wonders for your child’s emotional health and development.
The end of the school year is coming fast. If you’re a parent of tweens looking for ways to beat the summer boredom blues, we have you covered with this list.
Every child matures differently, and that’s especially true for devices. If you’re monitoring your child’s texts, the way you approach that supervision will change as they age. Here’s what you need to know.
The start of summer also means your child will have way more time for screens. Here are some conversation-starters to help manage their screen time:
📵 What happens when you don’t give kids phones until high school? The Cut profiles several teens whose parents delayed giving them devices, how that decision impacted their social lives, and how they use devices today.
🤔 Parents, we want to know: How do you handle summer screen time? Reply and share your thoughts! We’ll share a few responses in a future issue.
🔐 Did you know that the BrightCanary app has an easy-to-use way to store all your child’s passwords in one place? Learn more about the Password Vault — a free feature available in the BrightCanary app!
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Every other week, we see new headlines about the damaging effects smartphones and social media have on our children’s mental health. But what about practical solutions and tips for parents? Those get less attention. Until now.
Two recent studies reveal the critical role parents play in promoting healthy tech habits. Spoiler: the findings indicate that battling unhealthy tech boils down to communication and rule-setting, not ruling with an iron fist or spying on your kid.
A study published in JMIR Pediatrics and Parenting explored how digital interventions can help fight internet addiction (IA) in young children. IA is a behavioral disorder defined as excessive and uncontrolled use of the internet and digital devices. The study involved interviews with 28 parents of children aged 7–11 in Indonesia, along with child therapists. Why Indonesia? The risk of internet addiction tends to be higher in lower-income regions with lower quality of life, and the country has a high prevalence of IA, particularly among children.
The second study, published in Addictive Behaviors, examined strategies to reduce problematic smartphone use (PSU) among adolescents. PSU refers to a behavioral pattern where a person excessively uses their smartphone in a way that significantly interferes with daily life. This research followed 1,187 families with kids aged 14 to 18 over a six-month period.
The findings of these two studies point to two facts about parenting in the digital age: tools like Apple Screen Time can aid in monitoring, but they’re most effective when paired with open communication and clear rules.
Here are some practical steps you can take:
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
If you’re frustrated that your child keeps finding a way around their screen time limits, you’re not alone. Read on to learn some common ways kids turn off Screen Time and what parents can do about it.
As you get ready to hand your child their new device, it’s useful to create a family texting contract with expectations on how they will behave with their new privileges. Here’s how to do it.
One of the most effective methods for keeping your child safe online is to have regular conversations about their internet activity and what they encounter. Here are some conversation-starters to get you going:
✍️ President Biden has signed the REPORT Act into law. The bipartisan bill requires online platforms and social media companies to report child sex trafficking and online enticement to the National Center for Missing and Exploited Children’s tip line. The bill is the first major piece of legislation in years that would put enforcement and accountability mechanisms on social media platforms, according to the senators behind the bill.
⚖️ In response to the TikTok ban, both TikTok’s parent company ByteDance and a group of TikTok creators have officially sued the U.S. government.
🐤 New product feature: BrightCanary now displays deleted text messages in your child’s text threads. Download BrightCanary on the App Store today!
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Before we talk about child online safety legislation, let’s talk about seat belts.
In the 1980s, states began implementing laws requiring people to wear seat belts in cars. Despite studies from the 1950s demonstrating that seat belts save lives, it wasn’t until these laws were implemented that buckling up became routine. You enter a car, you fasten your seat belt. It’s a simple safety step that’s also mandated by law.
However, for decades between the 1950s and 1980s, people knew that seat belts were protective — but they didn’t necessarily use them. It took legislation to turn a proven safety measure into a routine habit.
A similar discussion is happening today with social media. A growing body of research points to social media’s negative effects on kids, ranging from their well-being to their brain development. But there are no national regulations to safeguard children on social media, and those that are passed at the state level face significant legal pushback from major tech companies.
In Congress, several pieces of legislation that impact children online are currently under discussion. Let’s look at a few of them making headway this legislative session:
Kids Online Safety Act (KOSA): Sets new safety standards for social media companies and holds them accountable for protecting minors. Users would also be allowed to opt out of addictive design features, such as algorithm-based recommendations, infinite scrolling, and notifications. The bill awaits a vote in the Senate and has been introduced in the House.
Children and Teens’ Online Privacy Protection Act (COPPA 2.0): Updates the Children’s Online Privacy Protection Act (COPPA). This measure would make it illegal for websites to collect data on children under the age of 16, outlaw marketing specifically aimed at kids, and allow parents to erase their kids’ information on websites. The bill awaits a vote in the Senate.
Sammy’s Law: Would require social media companies to integrate with child safety software, making it easier for parents to supervise their children’s online activities. The bill is currently in the House subcommittee on Innovation, Data, and Commerce.
Platform Accountability and Transparency Act (PATA): Provides protected ways for researchers to study data from big internet companies, focusing on how these platforms impact society. PATA would make it clearer how online platforms manage children's data and the effects of their algorithms. The bill was read twice in the Senate and referred to committee.
Also worth noting is the American Privacy Rights Act (APRA), a significant bipartisan measure yet to reach committee. It would establish national privacy and security standards, requiring transparent data usage and giving consumers, particularly children, greater control over their personal information.
In the future, we may look back at this period and wonder how we didn’t have stricter measures in place to protect kids online — just like that period when we didn’t wear seat belts. You can talk to lawmakers about the importance of children’s online safety legislation. To find your representative, go to congress.gov/members/find-your-member.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
You know you should monitor your child’s texts, but actually understanding their messages is a whole other story. Like previous generations of kids, Gen Z and Gen Alpha use slang to put their own spin on the way they communicate. We break down what it all means, bruh.
While it’s responsible to monitor your child’s text messages, that doesn’t mean anything goes. Here are some of the top mistakes parents make when monitoring their child’s texts so you can avoid making them yourself.
How will you check in with your child this week? Save these conversation-starters for your next tech check-in.
📵 Following a smartphone ban in Norwegian schools, middle school kids report feeling mentally healthier and performing better academically. After three years of the policy, girls’ visits to mental health professionals decreased by 60%, and both boys and girls experienced 43–46% less bullying.
🕯️ According to a new survey by Ohio State University, a majority of parents experience isolation, loneliness, and burnout from the demands of parenthood. A whopping 62% feel burned out by their responsibilities as a parent. Parental burnout researcher Kate Gawlik, DNP, stressed the need for self-care and the value of connection, encouraging parents to find local parent groups.
🐤🤖 Did you know? BrightCanary features an AI chatbot called Ask the Canary: an easy way to anonymously get answers to your toughest parenting questions. Find it in the BrightCanary app.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Another week, another round of TikTok drama: last week, the House passed a bill requiring the forced sale or ban of TikTok in the U.S.
The bill, titled the Protecting Americans from Foreign Adversary Controlled Applications Act (H.R. 7521), requires TikTok’s Chinese parent company, ByteDance, to sell the app’s U.S. operations within nine months (up from six in an earlier version, with the possibility of an extension to a full year). Otherwise, much like dancing in Footloose, it would be illegal for TikTok to be available for download in U.S. app stores.
Lawmakers claim that TikTok poses a national security threat because the Chinese government could potentially access the data of U.S. users and use the platform's algorithm to influence American public opinion. TikTok stated it has never been asked to provide U.S. user data to the Chinese government, wouldn’t do so if asked, and doesn’t tailor content based on political motives.
What happens next? The proposal sailed through a House panel earlier this month, but faced an uncertain future in Congress until it was attached to a foreign aid package that will send funds to Ukraine and Israel, making it more likely to be passed in the Senate. If passed, the bill could land on President Biden’s desk in the next week.
This doesn’t mean TikTok will be banned in time for Mother’s Day. The platform would have nine months to find a buyer, although it’s not clear if TikTok’s algorithm — aka the thing that makes it so compulsively scrollable and knows exactly which ASMR cooking videos to show you — will come with it.
If your child asks about the TikTok ban: Explain the topic in a way that’s appropriate for your child. The platform hasn’t been banned, but lawmakers are asking TikTok to find a new owner because they’re worried about how the company handles our personal information. Now’s a great time to explain how social media algorithms work, why it’s important to think critically about the information we consume, and how a bill moves through Congress.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
The American Psychological Association (APA) recently released another report on social media, calling on tech companies to fundamentally redesign social media to correct harmful features that are unsafe for adolescents.
Last year, the APA issued a health advisory on social media use in adolescence, in which the organization recognized the potential social benefits of social media but called out the need to protect kids from harmful content and problematic behaviors. This new report highlights the fact that companies and policymakers “still have made few meaningful changes” (translation: haven’t taken actions that’ll actually help kids).
The report highlights the ways in which common features of social media, such as infinite scroll and notifications, negatively impact kids. It also suggests paths forward for companies and policymakers. Some takeaways:
Apple Screen Time is a great tool to set limits and restrict certain activities. But Apple parental controls aren’t foolproof. We break down common complaints and new ways to keep your kiddo safe online.
Whether your kid is already obsessed with their Switch or wants a console to play with friends, you should know that Nintendo Switch parental controls exist, and you can use them to set time limits, limit certain games, and more.
How will you check in with your child this week? Save these conversation-starters for your next tech check-in.
🤔 What are social media algorithms, and how should you talk to your kids about them? BrightCanary CEO Karl Stillner writes for the Family Online Safety Institute about what parents should know.
🚫 Meta has rolled out new tools to help protect against sextortion and intimate image abuse on Instagram and Facebook.
👀 Do you ever look through your teen’s smartphone? According to Pew Research Center, 50% of parents say they do, and 47% say they set time limits on their teens’ phone use.
📞 The latest Gen Z trend: dumbphones. In other words, flip phones are back (here are our recommendations).
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
If your teen suddenly has a new lexicon of mental health terms, like “trauma response” and “major depressive disorder,” TikTok may be to blame. A poll by EdWeek found that 55% of students use social media to self-diagnose mental health conditions, and 65% of teachers say they’ve seen the phenomenon in their classrooms.
“Kids are all coming in and I’m asking them, ‘Where did you get this diagnosis?’” said Don Grant, national adviser for healthy device management at Newport Healthcare, in an interview with The Hill. Grant said he would get responses such as “Oh, there’s an [influencer],” “Oh, I took a quiz,” or “Oh, there’s a group on social media that talks about it.”
Social media can help kids understand their feelings and find ways to cope. The EdWeek poll found that 72% of educators believe social media has made it easier for students to be more open about their mental health struggles. And it makes sense that kids would turn to a space they know — social media and online groups — to get information, rather than finding a mental health professional first (or talking to their parents).
However, the topic gets tricky when you consider the fact that social media sites don’t exactly verify that the people sharing medical advice are, in fact, medical experts. While there are plenty of experts sharing legitimate information online, there are also influencers who are paid to talk about products that improved their anxiety and off-label medications that cured their depression.
Big picture: Self-diagnosing on social media is also problematic because algorithms can create a self-fulfilling prophecy. Most recommendation algorithms, including TikTok’s, use a person’s activity to determine what they see next in their feed. If a teen thinks they have depression, they’ll see more content about depression — which may seem to confirm their self-diagnosis, even if they aren’t clinically depressed.
As parents, it’s important to talk to your child about mental health, how to cope with big emotions, and what to do if they need a professional. But it’s also essential to know where they’re getting their mental health information and what they’re seeing on their social media feeds.
Don’t dismiss their feelings outright — be curious. Talk to your child about verifying their sources of information. If they’re getting medical advice from an online creator, are they an actual doctor or therapist? Or are they simply someone who’s popular online?
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Gov. Ron DeSantis recently signed a bill that bans kids under 14 from creating social media accounts and requires parental consent for kids under 16. The bill requires that companies delete accounts belonging to 14- and 15-year-olds and implement age verification measures to ensure that kids aren’t lying about their ages.
Florida’s bill is the most restrictive social media ban in the nation, and that’s after DeSantis vetoed an earlier version of the bill that would have banned all accounts for kids under 16. At the bill-signing ceremony, Republican Speaker Paul Renner said, “A child in their brain development doesn’t have the ability to know that they’re being sucked into these addictive technologies and to see the harm and step away from it, and because of that we have to step in for them.”
Legal upheaval: The bill takes effect Jan. 1, 2025, pending any legal challenges. Tech industry groups have already come out against the bill, including NetChoice, an association that represents major social media platforms and is currently fighting a separate social media law at the Supreme Court.
“This bill goes too far in taking away parents’ rights,” Democratic Rep. Anna Eskamani said in a news release. “Instead of banning social media access, it would be better to ensure improved parental oversight tools, improved access to data to stop bad actors, alongside major investments in Florida’s mental health systems and programs.”
In our last issue, we covered Utah’s decision to repeal and replace its social media law after months of legal challenges that delayed the bill’s implementation. Although DeSantis and Renner have signaled that they’re ready to fight to keep Florida’s social media ban in place, time will tell whether or not Florida’s kids will have to wait until their sweet 16 to get on Snapchat.
How will you check in with your child about online safety this week? Save these conversation-starters for your next check-in.
Sleep can impact everything from brain performance to mood to mental and physical health. Our children aren’t getting enough sleep, and screens are one of the prime suspects. But how does screen time affect sleep?
Pinterest use is up among teens. Gen Zers are using the website as a canvas for self-expression and exploration. Learn more about how to keep your child safe on the site with Pinterest parental controls.
😮💨 What is the “mental load” of parenting, and how does it affect your emotions, sleep quality, and job performance?
🚩 What are the red flags that you need to worry about your child’s mental health? Save this list from Techno Sapiens.
🤝 Rules and restrictions aren’t the end-all, be-all to parenting in the digital age — you also need a healthy, emotionally rich relationship with your teen. Read more at Psychology Today.
📵 When it comes to protecting kids’ mental health, Florida’s social media ban won’t be that simple, writes David French for the New York Times.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Last year, Utah passed a first-in-the-nation law that prevented kids from accessing social platforms without parental consent, among other restrictions. Fast-forward to now, and Utah Gov. Spencer Cox signed a pair of bills into law that repeal and replace almost everything. What happened?
As a recap, the 2023 Social Media Regulation Act: “required parental consent before kids can sign up for sites like TikTok and Instagram, prohibited kids under 18 from using social media between the hours of 10:30 p.m. and 6:30 a.m., require age verification for anyone who wants to use social media in the state, and sought to prevent tech companies from luring kids to their apps using addictive features,” via NPR.
Following major Big Tech lawsuits, Utah’s legislature recently passed H.B. 464 and S.B. 194. The new bills maintain age verification but repeal the ban on addictive design features, only require platforms to obtain parental consent if a child attempts to change certain privacy settings, and don’t require platforms to enable parental controls unless the minor agrees.
Like the previous version, the new legislation creates a process where parents can take social media companies to court. Parents can sue for a minimum of $10,000 per incident if a child has an “adverse mental health outcome” as a result of excessive social media use.
Big picture: Utah’s about-face underscores both the importance and difficulty of implementing social media regulation. After signing the Social Media Regulation Act into law in 2023, Gov. Cox nearly dared critics to sue the state over the law — and they did. NetChoice alleged the restrictions violated First Amendment free speech protections, and the Foundation for Individual Rights and Expression filed a second lawsuit claiming that the age verification requirement is unconstitutional. Florida’s own social media ban, which was recently signed into law, faces similar legal challenges and delays.
Utah’s “repeal and replace” version of the bill aims to address some of the concerns raised in the lawsuits, while still taking steps to protect kids online. Sen. Mike McKell, one of the revised bills’ authors, said that there is data to support — and justify — the state’s push to put guardrails around social media use at the state level.
“One of the bars that we have to overcome in legislating when we’re looking at First Amendment issues is whether there is a compelling state interest,” he told the Salt Lake Tribune. “We’re trying to tell the court explicitly why we’re passing this. Here’s the intent behind it. Here’s what we’re seeing in our state and why we’re passing this law.”
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
It’s another bad PR week for Meta: the Wall Street Journal reports that federal prosecutors are looking into whether Meta, the parent company behind platforms such as Facebook and Instagram, facilitates and profits from the online sale of drugs.
It’s alarmingly easy to find controlled substances for sale online. A 2021 report by the Tech Transparency Project found that it takes just two clicks for kids to find potentially deadly drugs for sale on social media. According to the Drug Enforcement Administration, “Drug traffickers have turned smartphones into a one-stop shop to market, sell, buy, and deliver deadly, fake prescription pills and other dangerous drugs” — which can easily contain deadly doses of fentanyl.
U.S. prosecutors sent Meta subpoenas last year and have been asking questions as part of a criminal grand jury probe. They have also requested records related to drug content or the illicit sale of drugs via Meta's platforms.
Bottom line: Investigations don’t always lead to formal charges, but this report places even more scrutiny on social media companies and how accountable they are for the content posted on their platforms. In a statement, a spokesperson for Meta said, “The sale of illicit drugs is against our policies and we work to find and remove this content from our services. Meta proactively cooperates with law enforcement authorities to help combat the sale and distribution of illicit drugs.”
We advocate for regular conversations about tech use and online safety. But how do you start those chats? We’re launching a new section this week: conversation-starters to kick off important dialogues with your kiddo about their devices, online interactions, and more. How will you check in with your kid about online safety this week?
You check your child’s phone or get an alert from your monitoring app, and you learn they’ve been messaging friends about drugs or looking at drug-related content online. Here’s what to do next.
“Sexting” refers to sending or receiving sexually explicit videos, images, or text messages. Here are some tips to talk to your teen about sexting, including the potential consequences and a plan for safe texting practices.
🕒 TikTok ban update: H.R. 7521 is sitting in a Senate committee, which is kinda like the waiting room of bills. The measure would ban applications controlled by foreign adversaries of the United States that pose a national security risk, and it unanimously passed the House earlier this month. The vote was held following a closed-door security briefing about TikTok’s risks, and a bipartisan group of legislators is pushing to declassify that information and hold a public hearing. Sens. Richard Blumenthal and Marsha Blackburn said, “As Congress and the Administration consider steps to address TikTok’s ties to the Chinese government, it is critically important that the American people, especially TikTok users, understand the national security issues at stake.”
📵 The costs of a phone-based childhood are harming our kids, writes social psychologist Jonathan Haidt.
👻 Snapchat is rolling out a feature that makes the messaging experience more like texts. The messages won’t vanish, but both users have to opt-in to the new setting.
👀 72% of teens feel peaceful without their smartphone, according to a new Pew Research Center survey — but 46% of teens say their parents are sometimes distracted by their phone when they’re trying to talk to them.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Today, the House overwhelmingly voted to pass a bill that would effectively ban TikTok in the United States. The bill now heads to the Senate, where its future is less certain. The measure, H.R. 7521, would ban applications controlled by foreign adversaries of the United States that pose a clear national security risk.
For years, U.S. officials have dubbed TikTok a national security threat. China’s intelligence laws could enable Beijing to snoop on the user information TikTok collects. Although the U.S. government has not publicly presented evidence that the Chinese government has accessed TikTok user data, the House vote was preceded by a classified briefing on national security concerns about TikTok's Chinese ownership.
If H.R. 7521 is passed, ByteDance will have 165 days to sell TikTok. Failure to do so would make it illegal for TikTok to be available for download in U.S. app stores. On the day of the vote, TikTok responded with a full-screen pop-up that prompted users to dial their members of Congress and express their opposition to the bill. In a post on X, TikTok shared: “This will damage millions of businesses, deny artists an audience, and destroy the livelihoods of countless creators across the country.”
"It is not a ban,” said Representative Mike Gallagher, the Republican chairman of the House select China committee. “Think of this as a surgery designed to remove the tumor and thereby save the patient in the process."
The bottom line: The bill passed the House Energy and Commerce Committee unanimously, meaning legislators from both parties supported it. Reuters calls this the “most significant momentum for a U.S. crackdown on TikTok … since then-President Donald Trump unsuccessfully tried to ban the app in 2020.” The legislation's fate is less certain in the Senate. If the bill clears Congress, though, President Biden has already indicated that he would sign it.
If your child uses TikTok, it’s natural that they may have questions about the ban (especially if they dream of becoming a TikTok influencer). Nothing is set in stone, and it’s entirely possible that TikTok would simply change ownership. However, this is a good opportunity to chat with your kids about the following talking points:
Set your child’s account to private, limit who can message them, and limit reposts and mentions. With a few simple steps, you can make Instagram a safer place for your kid. Here’s how to get it done.
Yikes — you found out that your child has been sending concerning videos, images, or messages to someone else. We break down some of the reasons kids send inappropriate messages and how to approach them.
🏛️ An update on Florida’s social media ban: as expected, Governor Ron DeSantis vetoed a bill that would have banned minors from using social media, but signaled that he would sign a different version anticipated from the Florida legislature.
📵 Nearly three-quarters (72%) of U.S. teens say they feel happy or peaceful when they don’t have their smartphones — but 44% say they feel anxious without them, according to Pew Research Center.
📖 Do digital books count as screen time? The benefits of reading outweigh screen time exposure, according to experts.
🗺️ How can parents navigate the challenges of technology and social media? Set limits, help your child realize how much time they spend on tech, and model self-restraint. Check out these tips and more via Psychology Today.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
A Florida bill that bans minors from using social media recently passed the House and Senate. The bill, HB1, is now on Governor Ron DeSantis’ desk. He’ll have until March 1 to veto the legislation or sign it into law.
DeSantis has previously said that he didn’t support the bill in its current form, which bars anyone younger than 16 years old from creating new social media accounts — and closes existing accounts for kids under 16. (DeSantis has called social media a “net negative” for young people, but said that, with parental supervision, it could have beneficial effects.) Unlike online safety bills passed in other states, HB1 doesn’t allow minors to use social media with parental permission: if you’re a minor, you can’t have an Instagram.
Even if DeSantis vetoes the bill, the fact that such an aggressive bill passed both the House and Senate with bipartisan support signals that the conversation about online safety legislation is reaching a tipping point.
The Kids Online Safety Act (KOSA), which would implement social media regulations at the federal level, also recently reached a major milestone: an amended version gained enough supporters to pass the Senate. If it moves to a vote and passes, it would be the first major child online safety legislation in 25 years, since the Children's Online Privacy Protection Act became law in 1998.
If passed, KOSA would make tech platforms responsible (aka have a “duty of care”) for preventing and mitigating harm to minors on topics ranging from mental health disorders and online bullying to eating disorders and sexual exploitation. Users would also be allowed to opt out of addictive design features, such as algorithm-based recommendations, infinite scrolling, and notifications.
In a previous iteration of KOSA, state attorneys general were able to enforce the duty of care. However, some LGBTQ+ groups were concerned that Republican AGs would use the law to take action against resources for LGBTQ+ youth. The amended version leaves enforcement to the Federal Trade Commission — a move that led a number of advocacy groups, including GLAAD, the Human Rights Campaign, and The Trevor Project, to state they wouldn’t oppose the new version of KOSA if it moves forward. (So, not an endorsement, but not-not an endorsement.)
What’s next? As of this publication, DeSantis has not signed or vetoed Florida’s social media ban. Plus, KOSA has yet to be introduced to the Senate for a vote, and it’s flying solo — there is no companion bill in the House, which would let both chambers consider the measure simultaneously.
However, the fallout from January’s Senate Judiciary Committee hearing — in which lawmakers grilled tech CEOs about their alleged failure to stamp out child abuse material on their platforms — may build momentum for future online safety legislation. We’ll keep our eyes peeled.
Spotify offers everything from podcasts to audiobooks — and with all of that media comes content concerns. The good news: both Spotify Kids and Spotify parental controls allow kids to enjoy their tunes while keeping their ears clean.
If you remember watching the pirate-themed anime series One Piece, you might be excited about the recently released live-action remake now streaming on Netflix and eager to share your love of the show with your kids. But is One Piece for kids?
🔒 Did you know that 90% of caregivers use at least one parental control? That’s according to a new survey from Microsoft.
📱 Social media is associated with a negative impact on youth mental health — but a lot of the research we have tends to focus on adults. In order to really understand cause and effect, researchers need to talk to teens about how they use their phones and social networks. Read more via Science News.
🛑 Meta announced the expansion of the Take It Down program, which is “designed to help teens take back control of their intimate images and help prevent people — whether it’s scammers, ex-partners, or anyone else — from spreading them online.”
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Odds are high that your child is currently involved in at least one group chat if they own a smartphone.
From social media to text messages, group chats are the modern equivalent of cliques. However, just like cliques that cluster next to lockers and gossip that spreads through whispers, group chats come with their own set of issues. It’s crucial for parents to understand this digital landscape so they can guide and support their kids through the ups and downs.
When posting on social media, teens have to negotiate the dynamics of different audiences seeing their posts. Group chats, by contrast, can feel more private and protected, allowing kids to share inside jokes and hop on video calls with a smaller circle of friends. Compared with passively scrolling through a feed, these more active behaviors can foster a greater sense of social support and belonging. Being part of a group chat, and keeping up with it, can help teens express their identity and feel closer to their friends.
At the same time, group chats come with risks.
We’re big proponents of staying involved in your child’s digital life. That includes setting boundaries around device usage and regularly monitoring their text threads and social media inboxes.
It’s also important to keep the lines of communication open. Ask your kid who they’re messaging, and let them know they can come to you when problems arise. You can also use a text monitoring service like BrightCanary to keep tabs on their messages and step in when they encounter anything concerning.
You know your child best. Check in with them, start the conversation about personal safety, and discuss when it’s time to leave a chat — especially if things turn harmful or make them feel bad.
Since 2018, Instagram users have had the option to create a Close Friends list and use it to limit who can see their Stories. Recently, Instagram expanded this option to include posts and Reels — we break down why we love this for parents.
It’s a familiar scene of modern parenting: your kid, hunched over their iPhone, furiously texting. You, dying to know what they’re saying. But should parents read their child’s text messages? If you decide to monitor your kid’s text messages on iPhone, how do you do it?
🏛️ The problems with social media got a lot of attention late last month around the Senate Judiciary Committee hearing, in which lawmakers grilled five tech CEOs about the effects of their platforms on young people. Following the 3.5-hour hearing, some experts say the momentum will help pass rules to safeguard the internet’s youngest users, while others say congressional gridlock will keep potential legislation in stasis.
💼 One takeaway from the Senate hearing: don’t mess with the APA because they will fact-check your claims. After Meta CEO Mark Zuckerberg claimed social media isn’t harmful to mental health, Mitch Prinstein, PhD, chief science officer of the American Psychological Association, clapped back and accused Zuckerberg of cherry-picking from the APA’s data.
🤖 How can AI help give teens protection and privacy on social media? Afsaneh Razi, assistant professor of information science at Drexel University, writes about how machine learning programs can identify unsafe conversations online (the same approach that BrightCanary takes!).
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.