In December, Australia became the first country to ban access to social media for children under 16. 

It was a watershed moment for an industry that derives a significant amount of revenue from underage users. It’s likely that the long-term future revenue of those users has also been of great interest to the likes of Meta, Snap, and TikTok – think of banks signing up students on college campuses.

But the move by Australian Prime Minister Anthony Albanese’s government attempted to curb all of that while also improving the protection of children online. After countless anecdotal stories and battles over whether empirical evidence proves increased harm among teenagers who use social media, Australia decided to do something.

The something was blunt: an outright restriction of access and the deactivation of accounts belonging to those under 16. The restrictions targeted Instagram, TikTok, YouTube, Snapchat, and X, all of which would face significant fines for not complying.

Many predicted that the ban would not have the desired effect, that for one, there would be workarounds – be it through VPNs or by spoofing the age check systems – and that it would likely instead divert teenage users towards more unregulated spaces online.

Australia fired the starting gun on bans and restrictions across the world, and lawmakers will be watching with great interest as early data begins to seep out. Earlier this month, we got a hint of how it’s going.

Not a panacea

A first-of-its-kind study by The Molly Rose Foundation, a British suicide prevention charity focused on young people, found that 61 per cent of Australian children who had accounts on the platforms before the ban came in still had access to one or more accounts.

The foundation was set up after the suicide of Molly Rose Russell, who died at 14 having been continually exposed to harmful content online while depressed. It polled 1,050 children in Australia between the ages of 12 and 15. The survey also found that most platforms have retained more than half of their child users, with 53 per cent of previous TikTok users, 53 per cent of YouTube users, and 52 per cent of Instagram users still able to access an account on those platforms.

The Molly Rose Foundation has repeatedly called for teenagers’ social media use to be tackled through more intensive regulation of the platforms and how they operate, rather than an outright ban. It has also advocated for regulators to force the platforms to change their addictive systems and to impose a duty of care on the companies themselves, rebalancing the scales between governments and social media firms. For its part, the foundation believes bans provide a “false sense of security”.

“Parents are united that change is needed to protect children from appalling harm online and it is crucial we see effective action that delivers the safety and wellbeing improvements we are crying out for,” Ian Russell, Molly’s father, said following the publication of the survey.

“The cost is too high to get this wrong by rushing into an Australia-style ban that offers the perception of security but is letting children down in practice.”

Indeed, similar sentiments have been echoed in Ireland by Noeline Blackwell, the online safety coordinator with the Children’s Rights Alliance. In February, Blackwell argued that while bans may be well-meaning, they could push children to “socialise in secretive ways” beyond the reach of regulation.

“A ban does not provide the clubs, the youth workers, the safe playgrounds, the links to their families and friends in the offline world that might nourish and support children. Above all, it doesn’t address the problem,” she said.

“The problem is not children, or the fact that they’re socialising online. The problem is that the products aren’t safe enough.”

Nonetheless, while restrictions have their flaws, a host of other countries have followed Australia’s lead. France, Spain, Britain, Austria, Denmark and many more are in the process of introducing their own restrictions. In Ireland, the Government said that it was “working actively with like-minded member states to explore options to introduce age restrictions on the use of social media, concentrating, in particular, on those under 16 years of age”.

Regulators and courtrooms

How such bans and restrictions interact with existing regulations like the Digital Services Act in Europe, introduced to force large online platforms to minimise risk to children online, will be interesting to watch.

What’s more, the industry is facing increasingly intense scrutiny over how its platforms are designed. Attention is an incredibly important metric for the industry: the more of it a platform captures, the more it can charge for advertising.

In March, YouTube and Meta were found liable in a landmark case in the US. A court in Los Angeles found that both companies had intentionally designed addictive products that hooked a young user and led to her being harmed. The case is one of many, and it may set a precedent that causes significant problems for the social media platforms by going to the root of how they are built. Both Meta and YouTube intend to appeal the ruling.

In addition, on Wednesday, the European Commission found Meta to be in breach of the Digital Services Act for failing to “diligently identify, assess and mitigate the risks of minors under 13 years old accessing their services”. The preliminary finding stated that a minor under 13 can enter a false birth date indicating they are older in order to access the service, “with no effective controls in place to check the correctness of the self-declared date of birth”.

The Commission also criticised Meta’s tool for reporting minors under the age of 13, saying that it is difficult to use and that even when they are reported, there “often is no proper follow-up”.

The impact of and fallout from these court cases, and how the tech firms respond to the concerns of parents, users, and governments around the world, will likely inform the push for more bans and more restrictions on young people.

While early data shows that bans may not be a panacea for protecting children online, it is likely that they will form part of governments’ response to the dangers posed to underage users online. Bans will be part of social media’s future, but so too will reform. Should more cases challenging the addictive nature of social media succeed, the platforms will be forced into a dramatic redesign that would likely improve the safety of adult and underage users alike.

Elsewhere last week…

“Delegate, don’t abdicate” has been Jack Kirwan’s mantra to turn Sprout & Co into a business on track to run 14 restaurants with 500 staff. Good hiring and a structured system are key to this phase of growth. He laid out his strategy to Stuart Fitzgerald.

DCC “unequivocally rejected” a £58-a-share takeover bid last week. Analysts expect a revised, higher offer come June, potentially validating the strategic focus on its energy business. Niall asked the question: What now for DCC after private-equity takeover bid rebuffed?

David O’Sullivan oversees how EU sanctions against Russia are devised. He spoke to Jonathan about dealing with sanctions evasion and the recent controversy with Aughinish Alumina.

Dan argued that personal taxes set at their current levels during the financial crisis are unfair – particularly when we examine what has been done with the resulting State revenues.