Yesterday, the world’s first comprehensive social media ban for minors came into force in Australia. The Online Safety Amendment (Social Media Minimum Age) Act 2024 requires platforms such as TikTok, Instagram, X, and Facebook to take “reasonable steps” to prevent children under the age of 16 from having an account.

In the event of non-compliance, operators face severe civil penalties of up to 49.5 million Australian dollars. Prime Minister Anthony Albanese praised the measure as taking back control from tech companies and defending children’s right to “be kids”, which supposedly gives parents “more peace of mind”.

However, this is merely the self-congratulatory façade of a policy that ignores the real, underlying problem. Now that the long-planned and long-passed law is in place, it is time for a deeper look into this form of placebo activism.

The Illusion of Control

Australia is serious about this effort: young people under 16 should no longer have access to social media. The government is touting this as a form of youth protection. In reality, it is a real-life experiment that has failed before. Since Australia is the site of the experiment, allow me to make the obvious pun that this law could return to haunt the government like a boomerang. That said, this is not something peculiar to Australia: Greece loves this idea too!

It is hard to overstate how sad it is, and how angry it makes me, that politicians boast so readily in public about this supposed “protection” while the reality of children’s lives is ignored. At its core, the law is fighting something that has long been part of everyday life: young people who know their way around technology better than any supervisory authority. Anyone who believes that TikTok, Snapchat, or Instagram can be banned from a teenager’s life by age verification has either never understood the internet or hasn’t used it since 2008.

Kids are Too Smart!

The existing age restriction of 13 years (based on US regulations) could be easily circumvented. Why should it be any different now? VPNs? That is just a matter of two clicks. Secondary accounts? Set up long ago. Family Apple IDs? An obvious blind spot. And anyone who really knows their way around switches to Discord, Telegram, or Signal anyway.

The alternative apps now soaring up the app store charts are Yope, a relatively unknown photo-sharing app; Lemon8, an Instagram clone from TikTok parent ByteDance; and Coverstar, which describes itself as a safe alternative to TikTok for 9 to 16-year-olds. Replacements for Snapchat, Instagram, and TikTok, it seems, surfaced a long time ago.

Australian parents have also been reporting that their children paint moustaches and wrinkles on their faces to trick the age verification system. Girls use false eyelashes and exaggerated make-up to appear older. Sometimes it is as simple as changing your own date of birth.

The Problems Haven’t Disappeared, They’re Just Becoming Invisible

But the problem does not lie in evasion — that is expected. The real risk lies elsewhere: bans do not remove problematic content from the world, but from the public eye. Violent videos do not disappear. They just shift from TikTok For You pages to WhatsApp groups, private clouds, or closed Discord servers.

This is precisely the disaster of this law: services such as messaging apps and online gaming platforms were excluded from the definition of “age-restricted social media platforms” in the first draft. When young people are shut out of regulated platforms, they turn to more private channels such as WhatsApp, Telegram channels, or even darker corners of the web such as 4chan. There, in encrypted groups where no platform algorithms can filter and no parental controls reach, it becomes much more difficult to recognize bad actors and intervene.

The eSafety Commissioner, the very authority responsible for online safety in Australia, has expressed concerns that this restrictive approach could cause young people to “migrate to less regulated, non-mainstream services” and limit their access to critical support.

The Real Losers and Convenient Politics

In my opinion, Australia is trying to regulate a basic digital need: connection, belonging, social participation, entertainment, or simply answering the everyday question of “Where is everyone else right now?” Teenagers aren’t drawn to social media because of the app itself. Nobody is on TikTok because of TikTok! They are simply looking for the place where their social life happens.

The ones who suffer are those for whom these platforms actually offer added value. 15-year-old Ezra Sholl, paralyzed from cancer, said Instagram and Snapchat are a “window into the world out there” and a way for him to share his life with his friends.

The government is now punishing children like Ezra who use the platforms positively, rather than tackling the real problem of harmful content. The risks and benefits of social media use vary from one young person to the next; a blanket ban is simply not the most beneficial solution.

The Platforms Get a Slap on the Wrist

At the same time, platform operators are getting off lightly. They must take “reasonable steps”, with the Minister suggesting they could use the same capabilities they already use to identify demographics for political parties (e.g., women of a certain age in selected zip codes). The law’s aim of holding tech giants accountable is actually spot on. However, the choice of means, which merely shifts children into invisible spaces, proves that the goal is not to protect children, but to get them out of sight without complications.

Bans are easy to implement. Responsibility, on the other hand, is difficult. Australia has opted for the easy path — and in doing so has only pushed the difficult problems deeper into the dark recesses of the net. Instead of focusing on digital due diligence and education, the government is creating a dangerous illusion of safety that reassures parents but leaves children isolated and unprotected.

Before people in other countries, such as the US, get the silly idea of celebrating this Australian law and emulating it, I sincerely hope that the plan is thoroughly reconsidered. It won’t help kids if they are partly banished to the dingy corners of the internet and partly cut off from participation. Let them learn media skills! Here’s a little tip: a shot of media literacy wouldn’t do us adults any harm either. Perhaps then we wouldn’t have to bother with laws like this!