Protecting Kids Online Requires a Safety Net
by Alexander Tidd
For most parents, raising kids in the age of social media can feel like walking a tightrope in flippers. You want to empower your kids to explore, but the digital world isn’t just content and friends—it’s an evolving jungle that can mislead, alarm, and even harm.
Enter Sammy’s Law, a legislative proposal picking up momentum in Congress. It’s named after 16-year-old Sammy Chapman, who was lured by a drug dealer through Snapchat and died after unknowingly ingesting fentanyl. The law proposes that platforms like TikTok and Snapchat must allow parents to use third-party safety apps to monitor serious risks—messages about drugs, self-harm, guns, or cyberbullying. Advocates argue it’s not about turning parents into surveillance agents, but about building a digital safety net for the moments when we can’t be there for every swipe of the day.
A Troubling Track Record
This push for policy isn’t coming out of nowhere. Experts like Jonathan Haidt, author of The Anxious Generation, argue that smartphones and social media have reshaped modern childhood—for the worse. This digital shift, which began in earnest around 2010, coincides with unprecedented spikes in anxiety, depression, and self-harm among teens. Young people are glued to screens, scrolling through highlight reels that teach comparison over creativity, anxiety over connection.
The data bears this out. Studies show that kids who spend more than three hours a day on social platforms double their risk of mental health problems like depression and anxiety. Girls are particularly susceptible to self-esteem issues as they compare themselves against filtered images and polished posts. Meanwhile, increased screen use impairs sleep, and addictive algorithms hijack attention spans meant for creative play and real-world connection.
Parents Are Only Part of the Solution
Parents can’t decode every notification or decipher every cryptic message. But there’s a lot they can do.
Talk it out: Encourage open conversations about content. Co-view social media sometimes and ask what it brings up for them—“How did that make you feel?” is more powerful than “No more TikTok.”
Build healthy habits: Set device limits. Tech companies often don’t offer clear data about usage; some third-party apps do—if the platform allows it.
Balance digital with real-world: Make space at home for phone-free hours—family game night, walks, bedtime stories without screens. Restoring those analog rituals keeps kids anchored to the real.
At the same time, parents need backup. We can’t track every swipe. That’s where laws like Sammy’s come in—designed not to shame or micromanage but to help catch emergencies, to give adults a chance to intervene before tragedy.
This isn’t just about sending alerts to worried moms and dads. It’s about shifting responsibility in a world run by nearly invisible tech giants.
Sammy’s Law is intentionally lean: it would only require platforms to support safe, professionally developed monitoring tools, not stand up an internal police state. It would avoid political flashpoints, focusing instead on physical danger, self-harm, and bullying. By doing so, lawmakers hope to put safety ahead of surveillance.
At a broader level, Sammy’s Law joins proposed legislation like the Kids Online Safety Act (KOSA), which would hold platforms responsible for reducing harm and protecting children’s privacy. These laws echo the growing acknowledgment that digital safety training isn’t enough. Regulation, trust, and transparency also have a role to play.
We don’t have to choose between banning phones outright and ignoring real risks. We can have confident kids, clear boundaries, and policy tools that give us a fighting chance when filters fail.
Sammy’s Law is not a parental mandate; it’s a safety harness. We know that even the most attentive parent can’t watch over their kids 24/7, and that private companies shouldn’t decide alone whether a child is at risk.
Responsible parenting in the digital age means being realistic. Kids need space, autonomy, creativity, and, most of all, a world where they can grow without algorithms steering them off-course. Lawmakers, parents, and educators need to move from finger-wagging to frameworks that genuinely protect our most vulnerable explorers online and in the world.