Major platforms must bar users under sixteen from accounts starting December 10, with Meta Platforms initiating removals from December 4
Australia’s federal government is implementing a world-first national restriction requiring most social media platforms to prevent users under sixteen from creating or maintaining accounts, with the landmark regulation taking effect on December 10, 2025. Platforms that fail to take the required “reasonable steps” face civil penalties of up to A$49.5 million (about US$32 million).
Meta Platforms, owner of Facebook, Instagram and Threads, announced it will begin deactivating the accounts of Australian users it estimates to be under sixteen starting December 4, ahead of the formal start date.
The company has begun notifying affected users via email, in-app alerts and SMS, urging them to download their data and update contact information so access may be restored once they turn sixteen.
Meta said it expects the account removals to be complete by December 10, while emphasising that full compliance will be an “ongoing, multi-layered process.”
The restriction stems from the Online Safety Amendment (Social Media Minimum Age) Act 2024, which passed Parliament in late 2024 and gives the eSafety Commissioner, the national online-safety regulator, oversight of youth access to so-called age-restricted social media platforms.
These are defined as services whose chief purpose is enabling users to interact and post content, a category that covers major apps including Instagram, TikTok, Snapchat, Reddit, X and YouTube.
Messaging apps, educational platforms and gaming services are generally excluded.
According to official figures cited by Meta, approximately 350,000 Australian Instagram users and 150,000 Facebook users aged between thirteen and fifteen will be affected.
The law imposes no liability on minors or their parents; the burden of compliance lies solely with the platforms.
Guidance issued by the eSafety Commissioner clarifies that companies will not necessarily be required to verify the age of every user; rather they must demonstrate a “reasonable-steps” approach, through a layered method of age assurance, monitoring and response.
For users who believe they have been wrongly flagged, Meta offers options to have the age assessment reversed, including a video selfie check or the upload of government-issued identification.
The company has noted a “natural error margin” in its age estimation process.
Supporters of the law argue it is a necessary update to protect adolescents from negative mental-health impacts, excessive screen time and harmful online interactions.
Prime Minister Anthony Albanese described the change as an “appropriate government response” to parental concerns and societal expectations around digital safety.
However, some observers warn of enforcement and privacy challenges.
A senior age-assurance expert at a leading university noted that automated age-estimation tools carry error rates of at least five per cent, and warned that the approach may inadvertently push younger users toward less-regulated digital spaces.
The regulator has warned platforms to anticipate such displacement risks and to adapt accordingly.
With Meta signalling compliance ahead of the formal deadline and the penalties for failure clearly defined, December will open a new chapter in online child-safety regulation as Australia takes a pioneering, high-stakes step toward age-governed social media access.