As Australia’s age-restricted social media law takes effect, Snapchat reports locking or deactivating 415,000 accounts believed to belong to users under sixteen, while highlighting challenges in enforcement and verification
Snapchat has disclosed that it has locked or disabled more than 415,000 accounts in Australia identified as belonging to users under sixteen, in response to the nation’s world-leading social media age-restriction law.
The action responds to the federal government’s Online Safety Amendment (Social Media Minimum Age) Act, which came into force on December 10, barring people under sixteen from major social media platforms and exposing companies to significant fines if they fail to take “reasonable steps” to block underage access.
The company said the accounts affected were those where users either self-declared an age under sixteen or were assessed to be underage using its age-detection technology.
Snapchat said it continues to lock more accounts daily as it works to comply with the regulatory requirements, which apply to a group of platforms including Facebook, Instagram, TikTok, X, YouTube and others.
Australia’s eSafety regulator has reported millions of similar account removals across ten platforms since the ban’s implementation, reflecting the law’s wide reach.
Snapchat and other major tech firms have stressed their commitment to compliance while voicing disagreement with elements of the law; Snapchat argues that because the app is primarily a visual messaging service, restricting younger users risks disrupting their social connections.
The company has also pointed to “significant gaps” in age-verification systems, noting that current technologies such as facial age estimation can misclassify users, allowing some underage individuals to retain access while incorrectly locking out others who are over sixteen.
The Snapchat disclosures come amid broader debate in Australia about the practical challenges of enforcing age restrictions online.
Some lawmakers have questioned elements of the ban’s design and called for refinements, including app-store-level age checks, while privacy and technology experts highlight the technical limitations of existing verification methods.
The evolving discussion underscores the complexity of enforcing age-based restrictions online at scale, as regulators and companies work to balance child safety against user access and privacy.