Banning kids from social media sounds decisive. Bold. Protective.
But in a world where digital is woven into how kids learn, play and connect, blanket bans risk feeling more symbolic than effective.
On 28 November 2024, the Australian Parliament passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024, introducing a mandatory minimum age of 16 for accounts on certain social media platforms. The law took effect on 10 December 2025 and, crucially, parents cannot override the ban by giving consent.
The rationale is clear. The Government says the ban is needed to protect the mental health and wellbeing of children and teens, arguing that the risks of social media - cyberbullying, harmful content and online predators - outweigh the benefits.
The ban currently applies to YouTube, X, Facebook, Instagram, TikTok, Snapchat, Reddit, Twitch, Threads and Kick. Tech companies that fail to take “reasonable steps” to prevent under-16s from holding accounts face fines of up to AU$49.5 million.
Some platforms sit outside the ban. Messenger Kids, WhatsApp, Kids Helpline, Google Classroom and YouTube Kids are expected to remain accessible. There are also notable gaps - including gaming platforms, AI chatbots (despite recent headlines around inappropriate conversations with kids) and dating websites.
So will it work?
UNICEF Australia isn’t convinced:
“UNICEF Australia thinks that the changes won’t fix the problems young people face online. Social media has a lot of good things, like education and staying in touch with friends. We think it’s more important to make social media platforms safer and to listen to young people to make sure any changes actually help.”
At Beano Brain, we agree that keeping kids and teens safe is non-negotiable. There is clear evidence that social media can negatively impact mental health. But we also believe that a blanket ban, without acknowledging how kids actually use social platforms, is unlikely to deliver the outcomes policymakers want.
If we’re serious about safety, here’s where we think the focus needs to shift:
Parents play a critical role in shaping healthy digital habits. In our work with families in China, we see that kids are rarely banned from platforms outright; instead, devices are routinely removed for a time to prioritise schoolwork or offline activities. That breathing space allows young brains to reset, explore and be imaginative without digital overload.
Take YouTube. For kids and teens, it functions far less as a social network and far more as their primary viewing platform. Banning it is, in effect, like taking Netflix away from adults - a very different cultural impact from blocking a chat-based app.
While cases like the Roblox controversy highlight real risks, kids themselves define safety more broadly: exposure to negativity, online drama and influencer “beefs” matters to them too. Many kids actively want less conflict in their feeds, not more.
Our view? Protecting kids online isn’t about switching platforms off. It’s about redesigning digital spaces so safety, creativity and connection can coexist. Bans might grab headlines but meaningful change will only come when we build systems that reflect how kids actually live, learn and socialise online.
If you'd like to learn more about how we challenge the brief, question assumptions and flip perspectives to deliver original, thought-provoking market research & strategic consulting, get in touch.