Major platforms to block users under sixteen from December 10 as Meta Platforms begins early removal of youth accounts
The Australian government is moving forward with its ground-breaking social media legislation, as major platforms prepare to bar users under the age of sixteen from accessing key services.
The law places responsibility on the tech companies themselves to enforce the age restriction, while exempting parents and children from direct penalties.
Meta Platforms, which owns Facebook, Instagram and Threads, announced on Thursday that it will begin deleting the accounts of Australian users it believes to be under sixteen from December 4, ahead of the law’s official commencement date of December 10. The company has begun notifying affected users via email, in-app alerts and SMS messages, urging them to download their contacts, memories and account data before deletion.
Meta estimates the number of Australian users aged thirteen to fifteen at approximately three hundred and fifty thousand on Instagram and one hundred and fifty thousand on Facebook.
Under the law, formally titled the Online Safety Amendment (Social Media Minimum Age) Act 2024, companies such as Meta, TikTok, Snapchat, X (formerly Twitter), Reddit and YouTube are required to take “reasonable steps” to prevent minors under sixteen from holding accounts.
Platforms that fail to comply may face fines of up to forty-nine and a half million Australian dollars.
Messaging services such as WhatsApp and Messenger, as well as educational and professional services, are exempt from the restrictions.
The government frames the measure as an appropriate step to protect young people’s physical and mental health amid rising concerns about screen time, social comparison and online harms.
Prime Minister Anthony Albanese has endorsed the law, saying it will empower parents rather than impose a top-down ban, and that it sends a clear signal about societal expectations.
Nevertheless, experts and industry groups have raised substantial practical and ethical questions about enforcement, age verification and unintended consequences.
Meta itself acknowledged the significant risk of error in age-estimation tools, and noted that Australian authorities recognise a “natural error margin.” Age verification methods being explored include government-issued identity checks and video selfies through a third-party provider.
Some critics warn that young people may simply circumvent the restrictions by using virtual private networks (VPNs) or borrowed credentials, or by migrating to lesser-regulated corners of the internet, exposing themselves to greater risks.
The regulator, the eSafety Commissioner, has confirmed the initial list of platforms subject to the age restrictions but noted the list may not be static; additional services could be designated in the future.
The law does not require platforms to verify the age of every individual user; it requires only that they demonstrate they have taken reasonable steps.
As December 10 approaches, Meta’s early notification programme signals that Australia’s world-first ban is entering its operational phase.
The next few weeks will test how effectively major platforms can identify and remove under-sixteen users without compromising user privacy, while balancing the government’s regulatory expectations against young people’s access to the digital public square.