
Meta has begun removing Australian children from Facebook and Instagram ahead of the country's under-16 social media ban.
Last month, Meta started notifying Australian users aged between 13 and 15 that their accounts would be deactivated, with the shutdown process commencing on December 4. The change is expected to affect around 150,000 Facebook accounts and an estimated 350,000 Instagram profiles. Since Threads operates as an extension of Instagram, its younger users are also part of this large-scale removal.
The full ban goes live on December 10, carrying hefty penalties for companies that don't comply. Social media platforms could face fines up to A$49.5 million (US$33 million, £25 million) if they fail to implement "reasonable steps" to prevent children under 16 from having accounts.
A Meta spokesperson recently told the BBC that ensuring compliance with the new legislation will be an "ongoing and multi-layered process." While Meta is committed to complying with the law, the company believes a "more effective, standardised, and privacy-preserving approach" is still needed. It has suggested the government could streamline the process by requiring app stores to verify users' ages and seek parental approval for under-16s at the point an app is downloaded, eliminating the current need for young people to repeatedly verify their age across different applications.
For affected teens, Meta has set out a process for managing their data. Users identified as under 16 were given the opportunity to download and save their posts, videos, and messages before their accounts were deactivated. If a teen believes they've been incorrectly identified as under 16, they can request a review, which may involve submitting a "video selfie" or providing official identification such as a driver's licence.
Beyond Meta's offerings, the new ban impacts a wide array of other major social media platforms, including YouTube, X, TikTok, Snapchat, Reddit, Kick, and Twitch.
The Australian government asserts that this ban is crucial for shielding children from the often-harmful aspects of social media. However, critics worry it could inadvertently isolate certain groups who rely on these platforms for connection and potentially push children towards less-regulated corners of the internet.
Communications Minister Anika Wells acknowledged on Wednesday that there might be "teething problems" during the initial days and weeks of the ban, but emphasized its role in protecting Generation Alpha – anyone aged 15 or under – and future generations. Wells described the ban as a way to safeguard the generation from being "sucked into purgatory by the predatory algorithms described by the man who created the feature as behavioural cocaine", and warned of young people becoming tethered to a "dopamine drip" from the moment they acquire a smartphone and social media accounts. She also said she is closely monitoring lesser-known apps such as Lemon8 (from TikTok's parent company ByteDance) and Yope to see whether children migrate to these platforms after the ban.
Earlier this week, Australia's eSafety Commissioner Julie Inman Grant wrote to both Lemon8 and Yope, which are primarily video and photo-sharing apps, asking them to self-assess whether they fall within the scope of the ban. Yope's chief executive and co-founder, Bahram Ismailau, told the BBC that while the startup had not yet received an inquiry from Inman Grant, it had already conducted its own assessment and concluded it is not a social media platform. Ismailau explained that Yope functions as a "fully private messenger with no public content at all," operating much like WhatsApp for "seeing your people every day and sharing your life with them safely and privately." Lemon8, meanwhile, has reportedly said it will exclude under-16s from its platform starting next week, even though it was not initially named in the ban.
YouTube, which was initially exempt but later included in the ban, has voiced its strong disapproval, labeling the law "rushed." The company argued that prohibiting children from having accounts – which typically come with parental controls – could paradoxically make its video-sharing platform "less safe."
Globally, leaders are closely watching Australia's pioneering social media ban. A government-commissioned study earlier this year shed light on the urgent need for action, revealing that 96% of Australian children aged 10-15 were using social media. Disturbingly, seven out of ten of these young users had been exposed to harmful content, including misogynistic and violent material, as well as content promoting eating disorders and suicide. The study also found that one in seven had experienced grooming-type behavior from adults or older children, and more than half reported being victims of cyberbullying.