Major platforms deactivate or restrict nearly 4.7 million child accounts after the new age-restriction law takes effect, prompting early debate on enforcement and impacts
Australia’s groundbreaking ban on social media use by people under the age of sixteen has resulted in the removal, deactivation or restriction of approximately 4.7 million accounts across major platforms in the first weeks since the law took effect in December 2025.
The policy, enacted under amendments to the Online Safety Act, requires major services such as Instagram, TikTok, Facebook, YouTube, Snapchat, X, Reddit, Threads, Twitch and Kick to take “reasonable steps” to prevent children from holding or creating accounts, with potential fines of up to A$49.5 million for non-compliance.
Data collected by Australia’s eSafety Commissioner and released by the prime minister’s office shows that social media companies have complied with the new law, cutting access for millions of accounts believed to be held by users under sixteen, and reporting their progress to regulators.
Meta, the parent company of Facebook, Instagram and Threads, reported deactivating more than 500,000 suspected underage accounts in the initial enforcement period alone, while other platforms have similarly removed or restricted accounts based on age assessments and verification measures.
Australian officials, including Communications Minister Anika Wells and eSafety Commissioner Julie Inman-Grant, described the early results as an encouraging sign that the law is working as intended and that children’s exposure to harmful online environments is being reduced.
They emphasised that platforms are also expected to strengthen their age-verification systems, both to prevent new under-sixteen accounts from being created and to stop checks being circumvented through false information or alternative methods.
Despite the large number of accounts affected, challenges have already emerged, with reports that minors may migrate to unregulated or alternative platforms not yet covered by the law, or attempt to evade age checks altogether.
Government and industry leaders say ongoing monitoring, monthly reporting and potential expansion of the regulatory framework will be important as the policy matures.
The approach has gained international attention, with other countries considering similar age-based restrictions and looking to Australia’s experience to inform their own debates on youth digital safety.