YouTube Warns Against Australia’s Under-16 Social Media Ban


YouTube has warned that Australia’s new laws, which block users under 16 from accessing major social media platforms, could leave children less protected online once the rules take effect on December 10.

The company says the legislation, part of the Social Media Minimum Age Act, will force it to disable accounts for all under-16s, cutting off access to parental controls, supervised accounts, and other key safety tools. Children will still be able to watch videos, but only while signed out.

Because those tools rely on logged-in accounts, YouTube said that parental controls, content filters, and channel-blocking features will stop working for under-16s once the service automatically signs them out, leaving parents unable to properly oversee what their children watch. Rachel Lord, senior manager of public policy at Google and YouTube Australia, said the law could "make Australian kids less safe online" and described it as "rushed regulation."

Government Pushes Back

Communications Minister Anika Wells criticised YouTube’s position, saying it was “outright weird” for the platform to claim children would be unsafe without its own systems.

“If YouTube is reminding us that it is not safe for kids, then that is a problem YouTube needs to fix,” she said.

The government reversed an earlier exemption for YouTube in July after the eSafety Commissioner found that children aged 10 to 15 most often named the platform as a source of harmful content.

Emerging Apps Under Scrutiny

The eSafety Commissioner has also asked two rapidly growing apps, Lemon8 (owned by TikTok's parent company) and Yope, to assess whether they fall under the new restrictions. Both apps have seen a surge in use among Australian teens in the weeks leading up to the age ban.

From 10 December, young users will be unable to upload videos, post comments, or access wellbeing features such as break reminders, which rely on logged-in accounts. YouTube Kids, the company’s separate child-focused app, is unaffected.

Lord said the legislation did not allow for “adequate consultation” and failed to address “the real complexities of online safety regulation.” Reports have suggested Google is considering a legal challenge, though the company declined to comment.

Broader Effort to Curb Harmful Online Behaviours

Wells acknowledged that the rollout may cause “teething problems,” but she argued that the changes are necessary to drive a long-term cultural shift. She said Generation Alpha has become hooked on a “dopamine drip” of constant notifications and algorithm-driven content.

She said, “With one law, we can stop predatory algorithms from pulling young Australians into harmful online environments.”

Under the Act, platforms that fail to deactivate existing underage accounts, or that allow new ones to be created, face fines of up to A$49.5 million (£25m; US$33m). Companies must also submit six-monthly reports detailing the number of under-16 accounts detected on their services.

The ban applies to most major platforms, including Facebook, Instagram, TikTok, Snapchat, X, Twitch, Threads, Reddit, and Kick.
