
Meta Begins Removing Australian Children from Instagram and Facebook

New safety rules push younger users off social media to protect them from online risks

By Kashif Wazir · Published about a month ago · 4 min read

Meta, the company that owns Instagram and Facebook, has started removing Australian children from its platforms as it moves to comply with the country's new social media minimum-age rules. The step has surprised many parents and young users, but Meta says it is necessary to protect children from online dangers. The company is now enforcing stricter age checks, and thousands of accounts belonging to underage users are expected to be removed in the coming weeks.

For years, platforms like Facebook and Instagram have required users to be at least 13 years old. However, many children created accounts using fake birthdates or with help from older friends. As a result, millions of kids around the world have been using social media even though the platforms were not designed for them. Governments and child-safety experts have raised concerns about this, especially after several studies showed that spending too much time on social media can affect young people’s mental health.

Australia has been one of the strongest voices calling for better protection for children online. Its new social media minimum-age law requires platforms to take reasonable steps to keep users under 16 off their services, and the government has pressured tech companies to verify ages more carefully and remove users who are too young. In response, Meta has introduced systems that can detect when a user might be underage—even if they lied about their age during sign-up. These systems examine behavior patterns, connections, and activity to estimate whether an account likely belongs to a child.

Once an account is flagged for being underage, Meta sends a notification asking the user to verify their age. If they cannot provide proof—such as an ID or other age-confirmation method—the account is removed. This process is now happening across Australia. Many children have already reported that their accounts were suddenly locked or deleted, and parents are receiving messages that explain the reason.
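
For readers curious how such a flag-then-verify process might look in practice, here is a minimal illustrative sketch in Python. Meta has not published its detection logic, so every signal name, weight, and threshold below is an invented assumption for illustration only, not the company's actual system.

```python
# Purely illustrative sketch of a flag -> verify -> remove flow.
# Meta's real signals, weights, and thresholds are not public;
# everything here is an assumption made up for this example.

from dataclasses import dataclass
from typing import Optional

MINIMUM_AGE = 16      # Australia's new legal minimum (13 elsewhere under Meta's terms)
FLAG_THRESHOLD = 0.5  # assumed cutoff for asking a user to prove their age

@dataclass
class AccountSignals:
    stated_age: int                   # age given at sign-up, possibly false
    underage_connection_ratio: float  # assumed: share of contacts who appear underage
    school_hours_activity: float      # assumed: share of activity during school hours
    follows_youth_content: bool       # assumed: follows accounts aimed at children

def underage_score(signals: AccountSignals) -> float:
    """Combine the assumed signals into a rough 0-1 likelihood."""
    score = (0.4 * signals.underage_connection_ratio
             + 0.3 * signals.school_hours_activity)
    if signals.follows_youth_content:
        score += 0.3
    return min(score, 1.0)

def enforce(signals: AccountSignals, verified_age: Optional[int]) -> str:
    """Mirror the flow described above: flag, ask for proof, then keep or remove."""
    if signals.stated_age >= MINIMUM_AGE and underage_score(signals) < FLAG_THRESHOLD:
        return "no action"
    if verified_age is not None and verified_age >= MINIMUM_AGE:
        return "age verified, account kept"
    return "account removed (no valid proof of age)"

if __name__ == "__main__":
    flagged = AccountSignals(
        stated_age=18,                 # claims to be an adult at sign-up
        underage_connection_ratio=0.8,
        school_hours_activity=0.6,
        follows_youth_content=True,
    )
    print(enforce(flagged, verified_age=None))  # -> account removed (no valid proof of age)
```

The real system almost certainly combines far more signals with machine-learned models, but the overall flow (score an account, flag it, request proof, then keep or remove it) mirrors the process described above.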

Meta says the goal is not to punish children but to protect them. Young users can be easily exposed to harmful content, cyberbullying, fake information, and online predators. Studies have shown that children under 13 often feel pressured by likes, comments, and online popularity. They may also compare themselves to others, which can harm their confidence and mental wellbeing. By removing underage accounts, Meta hopes to reduce these risks.

Child-safety groups in Australia mostly support this move. Many organizations have complained for years that tech companies were not doing enough to enforce age rules. They argue that children should not be left alone on platforms filled with adult content, ads, and strangers. Some groups believe this new policy will encourage parents to talk openly with their children about social media and help them understand its dangers.

However, not everyone agrees with Meta’s decision. Some parents argue that their children use Instagram and Facebook responsibly, mostly to stay in touch with family and school friends. Many say that removing the accounts does not truly solve the problem—it only pushes children to create new fake accounts on other platforms, possibly less regulated ones. Some also worry that children will move to social media apps that have even fewer safety tools.

Young users are also unhappy. Many Australian kids enjoy watching videos, sharing photos, and messaging friends on Instagram. For them, losing their accounts feels like losing a part of their social world. Some have shared their frustration online, saying they feel punished even though they did nothing wrong. Others are confused about why they need to prove their age in a place they considered fun and harmless.

Meta understands these concerns but insists that child safety must come first. The company is also working on new tools for parents, including controls that let adults see how much time their teens spend online and which features they use. Meta says it is exploring safer experiences for younger users, but in Australia it will no longer allow anyone under 16 to hold a regular Instagram or Facebook account.

This decision also comes at a time when governments around the world are talking about new social media laws. Some countries want age verification to be legally required. Others want companies to build special platforms designed only for children, with no ads and strict content filters. Meta’s action in Australia may be a sign of what is coming in other parts of the world.

For now, Australian parents are being encouraged to talk with their children about why their accounts were removed. Safety experts recommend helping kids find healthier online activities, such as educational apps, kid-friendly platforms, or supervised family accounts. They also suggest teaching children how to use the internet responsibly, even if they are not on social media yet.

In the bigger picture, Meta’s move marks a major shift in how tech companies approach young users. For many years, social media platforms grew quickly by allowing almost anyone to join. Now, with rising concerns about mental health, privacy, and safety, companies are being pushed to slow down and take responsibility.

In conclusion, Meta’s removal of underage Australian users is a strong step toward creating a safer online world. While the change is difficult for many children and families, it highlights the growing need to protect younger users from harmful experiences. As the digital world continues to grow, decisions like this may shape the future of how children interact with technology—making the internet safer, healthier, and more supportive for the next generation.




