Social Media Owners Not Complying with Under-16 Ban, New Report Reveals


Social Media Platforms Face Scrutiny Over Under-16 Age Ban Compliance

Four months after Australia implemented a ban on social media use for individuals under 16, authorities have released a comprehensive report evaluating platform compliance. The findings reveal a mixed landscape: while millions of underage accounts have been removed, significant challenges persist, and several major platforms are under investigation for potentially circumventing the new regulations. This report holds particular relevance as other countries consider adopting similar legislative measures to safeguard young users online.

Initial Progress: Millions of Underage Accounts Removed

The 17-page government report brings some positive news. By mid-January, approximately 4.7 million underage accounts had been removed from various social media platforms. The report further projects that an additional 310,000 accounts will be removed by early March 2026.

Persistent Challenges: Major Platforms Under Scrutiny

Despite the initial progress, authorities are actively investigating whether major platforms, including Facebook, Instagram, Snapchat, TikTok, and YouTube, are fully adhering to the new regulations. Each platform faces an individual inquiry, focusing on four key areas where social media owners appear to be resistant or fall short:

  • Encouraging Workarounds: Messages displayed after users enter their true birth date (some of which predate the new regulations) appear to encourage minors to bypass age verification systems.
  • Unlimited Verification Attempts: Platforms impose no limit on the number of age verification attempts, allowing underage users to retry until they succeed.
  • Complex Reporting Mechanisms: The process for reporting underage accounts is not as straightforward or intuitive as legislators intended, making it difficult for concerned users or parents to flag non-compliant accounts.
  • Insufficient Age-Gating: Social media owners have not implemented sufficient measures to prevent minors from creating new accounts in the first place, allowing underage users to register successfully despite the ban.

Tactics Used by Platforms to Circumvent the Ban

The report highlights several concerning examples of how social media platforms might be inadvertently or deliberately enabling minors to bypass age restrictions or complicating the removal process:

  • Burdensome Parental Verification: A parent attempting to block their child’s account might be asked by the platform to provide an official government document proving parenthood. If the parent lacks the time or resources to obtain such documentation, the underage account remains active.
  • Lack of Proactive Age Verification: A child creates an account, falsely claiming to be 16 years old. No follow-up message requesting age verification is ever sent, allowing the account to remain operational.
  • Flawed Facial Recognition: A child tells the platform they are 14, yet the platform offers facial recognition as an alternative age check. If the scan incorrectly estimates the user’s age as 16 or older, the account remains active, overriding the child’s own declaration.

These examples underscore the ongoing challenges in enforcing age restrictions effectively and highlight potential loopholes that minors can exploit or that platforms may not be adequately addressing. For more insights into how young people might be finding ways around these bans, explore our related article: Australia’s Social Media Ban for Under-16s: Are Teens Finding Workarounds?

Broadening the Definition of Social Media and Imposing Penalties

In response to these challenges, authorities have further refined their definition of social media to encompass platforms that include “addictive or other harmful characteristics, fundamental to their design.” This updated definition aims to address the inherent design features that can contribute to problematic use, particularly among younger audiences.

The new definition specifically references features such as:

  • Infinite scrolling of content, designed to maximize engagement.
  • Content reactions that promote social comparison among users.
  • Time-limited content, which compels users to constantly check their phones to avoid missing out.

A decision on whether any of the major technology companies will face penalties for non-compliance is anticipated by mid-2026. This extended timeline allows for thorough investigations and ensures a fair assessment of each platform’s adherence to the regulations. The increasing global concern over the impact of social media on mental health has even led to significant legal actions, as seen in cases like the Social Media Addiction Lawsuit: Meta, Google, and the Parallels to Big Tobacco.

The Path Forward: Ongoing Investigations and Recommendations

One thing is clear: the authorities are taking social media regulation exceptionally seriously. Regulators aim to complete at least some of the ongoing investigations, continue communication with platform owners, and refine their recommendations by June 2026, striving for a more robust and effective regulatory framework.

Frequently Asked Questions (FAQ)


Why did Australia implement a social media ban for those under 16?

The ban was implemented to protect minors from the potential harms associated with social media use, including exposure to inappropriate content, cyberbullying, and addictive design features that can negatively impact mental health and development.


Which social media platforms are currently under investigation?

According to the report, Facebook, Instagram, Snapchat, TikTok, and YouTube are all facing individual investigations regarding their compliance with the new age verification and enforcement regulations.


How are authorities defining “social media” in the context of this ban?

The definition has been expanded to include platforms with “addictive or other harmful characteristics, fundamental to their design.” This covers features like infinite scrolling, reactions that promote social comparison, and time-limited content designed to keep users constantly engaged.


What are the potential consequences for social media companies found to be non-compliant?

While specific penalties are yet to be announced, a decision on whether major technology companies will be fined for non-compliance is expected by mid-2026. The intent is to ensure platforms adhere strictly to regulations designed to protect underage users.


What role can parents play in ensuring their children comply with age restrictions on social media?

Parents play a crucial role by actively monitoring their children’s online activities, discussing the risks of social media, and utilizing parental control features where available. While platforms are responsible for enforcement, parental involvement remains key to safeguarding minors online. They should also be aware of the complexities involved in reporting underage accounts and advocate for simpler, more intuitive reporting mechanisms.

Source: The Conversation / eSafety Commissioner
Opening photo: Gemini
