Why Banning Kids from Social Media Won’t Work — And What To Do Instead

Written by PRIVO | 8/25/25 2:01 PM

Calls to ban kids from social media are echoing across EU member states. The European Commission recently confirmed that member states may implement their own national bans on social media for minors under the Digital Services Act. Australia has already enacted the Online Safety Amendment (Social Media Minimum Age) Act 2024, which will bar anyone under 16 from social media platforms. In the U.S., some states have also passed social media laws affecting minors.

On the surface, banning kids may seem like the right move. The harms children face online are well documented by researchers, policymakers, and advocates. But in practice, bans are a blunt instrument: an easy win on paper that does little to solve the complex challenge of creating safe digital spaces for children, who live in a world where online and offline are inseparable.

According to Ofcom, 96% of UK children aged 5–17 are online, and 97% of homes with children have internet access. Blocking children from social media ignores this reality. Instead, we should focus on giving them safer, age-appropriate experiences online.


The Problem with Bans

Children have always found ways to bypass weak age gates, and social platforms have historically looked the other way—profiting from young users’ data and attention. Whistleblower testimony, government inquiries, and volumes of academic research all show how deeply platforms have failed to put children first.

What’s Really Needed

Rather than exclusion, we need robust privacy and safety by design—systems that welcome younger users into environments suited to their age and development. That means:

  • Stronger age assurance and verification so platforms know who is using their services.

  • Protections built into design, following models like the UK’s Children’s Code.

  • Parental controls and flexible ID solutions that give families safer ways to manage consent.

  • Accountability for platforms and their executives, backed by meaningful oversight.

This will require investment, but children are the future customers of these platforms. Using some of the billions made from exploiting them to instead protect and empower them is not only right—it’s sustainable.

Holding Platforms Accountable

Regulation must go further than design principles. Oversight and enforcement are essential, and platforms must face real consequences when they fail children. Some advocate reforming Section 230, the U.S. law that shields platforms from liability for user content, though that remains politically fraught. Other tools, such as stronger enforcement of the EU Digital Services Act and the UK Online Safety Act, and passage of legislation like the Kids Online Safety Act (KOSA) in the U.S., can also help move the needle.

A Smarter Path Forward

Bans alone won’t work. Children deserve safe, age-appropriate experiences online, not exclusion. Achieving this requires shared effort—from industry, regulators, educators, professionals, and parents. With the right mix of age assurance, privacy by design, and accountability, we can create digital spaces where children can learn, connect, and grow without being put at risk.

Blunt instruments are not the answer. Smart, enforceable protections are.