Meta Blocks 544,000+ Accounts Under Australia’s Social Media Ban
Meta has blocked more than 544,000 accounts within the first few days of enforcing Australia’s new social media rules for children. This is the first major action taken under the country’s new minimum age law for online safety.
The legislation received Royal Assent in December 2024, and the age-restriction obligations took effect on 10 December 2025. The law prohibits designated social media platforms from allowing Australians under 16 to create accounts. Platforms reported to be covered include Instagram, Facebook, Threads, X, YouTube, TikTok, Snapchat, and Reddit. The policy has gained international attention, and regulators and governments worldwide are closely monitoring its implementation and impact.
Meta reported that 330,639 Instagram accounts, 173,497 Facebook accounts, and 39,916 Threads accounts were blocked between 4 and 11 December 2025. The company said the move reflects its obligation to comply with local legislation, while it continues to press for industry-wide alternatives to improve online safety.
The Australian government and campaigners have justified the ban as a necessary precaution to keep children away from harmful content, addictive algorithms, and online exploitation. The policy enjoys broad support, especially among parents, and has already drawn the attention of international policymakers. In the United Kingdom, the Conservative Party has even pledged to take the same path if it returns to power.
However, Meta and a number of industry experts argue that blanket bans are not the most effective solution. Meta stated in a blog update, “We call on the Australian government to engage with industry constructively to find a better way forward, such as incentivising all of industry to raise the standard in providing safe, privacy-preserving, age-appropriate experiences online, instead of blanket bans.”
Critics also warn that the ban can be easily circumvented and risks pushing young users toward less safe corners of the internet. Some mental health advocates and youth organizations argue that it could harm vulnerable groups, such as LGBTQ+ young people and neurodivergent and rural children, by cutting off their access to online connection and support.
As implementation continues, the Australian model will help shape the global debate on digital child safety and age verification, and inform future policy on young people’s access to social media.
On 10 December 2025, UNICEF warned that while age restrictions on social media aim to protect children, they can backfire if not combined with safer platform design, effective content moderation, and rights-respecting tools. The organization urged governments, tech companies, and families to work together to create inclusive, safe digital environments for young users.