Privacy and Australia’s Social Media Minimum Age Requirements

From 10 December 2025, some social media platforms operating in Australia will have to take reasonable steps to prevent Australians under the age of 16 from creating or keeping an account on their platform. This new law is called the ‘social media minimum age’ (SMMA) requirement, but it is also sometimes referred to as the social media ‘ban’, ‘delay’, ‘age restriction’, or ‘age obligation’.

A lot has been written about the new law, which will require social media platforms to introduce robust age verification processes to ensure that access is limited to those 16 or over. While these measures are aimed at protecting younger users, they also raise complex privacy questions, particularly around the collection and storage of personal information used for age assurance.

Platforms will need to balance the need to verify users’ ages accurately against the obligation to minimise the amount of data collected from individuals, especially minors. This creates a tension between user privacy rights and regulatory compliance that must be carefully managed to avoid unintended consequences.

In this post, we give a quick overview of the new age restriction requirements and consider the privacy issues associated with implementing them. We’ve also included a Checklist for compliance with Australia’s SMMA Requirements.

Australia’s new SMMA Requirements

The new SMMA requirements provide that social media platforms classified as ‘age-restricted’ must take reasonable steps to prevent Australians under 16 from creating or maintaining an account. These rules fall within the remit of the eSafety Commissioner, whose role is to keep people safe online. Since the introduction of the Online Safety Act 2021 (Cth), the eSafety Commissioner has focused on ensuring that social media, messaging, gaming and app services and website providers take reasonable steps to keep Australians safe online, with an ever-increasing portfolio.

The new SMMA requirements have been included in the Online Safety Act (via the Online Safety Amendment (Social Media Minimum Age) Act 2024).

Which platforms will be age-restricted?

SMMA applies to age-restricted social media platforms – which are those where users can create profiles, share content, and interact publicly or privately. This includes mainstream platforms popular with young people, such as Instagram, TikTok, Snapchat, and similar services.

Platforms that do not allow user-generated content, operate primarily as messaging services without social media features, or are intended for professional networking (such as LinkedIn) may not be covered. 

Additionally, platforms with limited or no interactive social features are typically excluded.

Platforms need to self-assess whether they are age-restricted social media platforms as defined. The eSafety Commissioner has published guidance to help online service providers assess whether their services are age-restricted social media platforms and therefore subject to the SMMA requirements. The steps to determine whether you are an age-restricted social media platform include:

  1. Is the service an ‘electronic service’?
  2. Is any of the material on the service accessible to, or delivered to, one or more would-be end-users in Australia?
  3. Does the service allow end-users to post material on the service?
  4. Does the service allow end-users to link to, or interact with, some or all of the other end-users?
  5. What is the purpose of the service? Does the purpose include enabling online social interaction between two or more end-users?
  6. Is online social interaction the sole purpose, or a significant purpose?
  7. Is the service excluded under the legislative rules?
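Purely to illustrate the shape of this self-assessment, the seven questions above could be sketched as a simple decision function. The field and function names below are our own, not eSafety’s, and a yes/no model like this cannot substitute for the case-by-case legal assessment the guidance requires:

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    """Illustrative answers to the self-assessment questions.

    All field names are hypothetical; they simply mirror the seven
    questions in the eSafety guidance and are not an official schema.
    """
    is_electronic_service: bool
    accessible_in_australia: bool
    users_can_post_material: bool
    users_can_interact: bool
    social_interaction_is_significant_purpose: bool
    excluded_by_legislative_rules: bool

def appears_age_restricted(p: ServiceProfile) -> bool:
    """Rough decision flow only: a service appears age-restricted if it
    passes each threshold question and is not excluded by the rules."""
    if not (p.is_electronic_service and p.accessible_in_australia):
        return False
    if not (p.users_can_post_material and p.users_can_interact):
        return False
    if not p.social_interaction_is_significant_purpose:
        return False
    return not p.excluded_by_legislative_rules
```

Note that question 6 (whether social interaction is the sole or a significant purpose) is the step most likely to require legal judgment rather than a simple boolean answer.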

Who is regulating SMMA?

There are three bodies involved with overseeing SMMA: the Minister for Communications, the Office of the Australian Information Commissioner (OAIC), and the eSafety Commissioner share roles in legislating, regulating, and enforcing the scheme.

The Minister for Communications is responsible for making legislative rules regarding SMMA.

eSafety is responsible for specifying the reasonable steps that age-restricted social media platforms must take to comply with their SMMA obligations. The eSafety regulatory guidance is here.

The OAIC focuses on privacy compliance. More specifically, the OAIC’s role is to oversee compliance with, and enforcement of, the privacy provisions set out in section 63F of Part 4A of the Online Safety Act.

What privacy obligations apply under the SMMA scheme?

Part 4A of the Online Safety Act operates alongside the Privacy Act and introduces additional, more stringent privacy obligations on the handling of personal information to comply with the SMMA requirement. Part 4A applies to both age-restricted social media platform providers and third-party age assurance providers when handling personal information for SMMA compliance purposes.

In summary, the Part 4A privacy obligations are:

  • Purpose limitation (s 63F(1)): An entity that holds personal information about an individual that was collected for the purpose of (or purposes including) the SMMA obligation must not use or disclose the information for any other purpose. The following exceptions apply:
    • In circumstances where APP 6.2(b), (c), (d) or (e) apply; or
    • With the voluntary, informed, current, specific and unambiguous consent of the individual (s 63F(2)).
  • Information destruction (s 63F(3)): An entity that holds personal information about an individual that was collected for the purpose of (or purposes including) the SMMA obligation must destroy the information after using or disclosing it for the purposes for which it was collected.
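To illustrate how the purpose limitation and destruction obligations interact, here is a minimal sketch. The class and method names are hypothetical, and the model deliberately omits the APP 6.2 exceptions and the s 63F(2) consent pathway, both of which would need to be handled in any real system:

```python
from dataclasses import dataclass

@dataclass
class SmmaRecord:
    """Hypothetical record of personal information collected for SMMA
    age assurance. Names are illustrative only."""
    data: dict
    collection_purposes: set  # purposes the information was collected for
    destroyed: bool = False

    def use(self, purpose: str) -> dict:
        """Purpose limitation (s 63F(1)): refuse any use or disclosure
        outside the collection purposes. (Exceptions under APP 6.2 and
        consent under s 63F(2) are not modelled here.)"""
        if self.destroyed:
            raise RuntimeError("information already destroyed")
        if purpose not in self.collection_purposes:
            raise PermissionError(f"use for {purpose!r} not permitted")
        return self.data

    def destroy_if_purposes_met(self, purposes_met: bool) -> None:
        """Destruction (s 63F(3)): destroy the information once the
        purposes for which it was collected have been satisfied."""
        if purposes_met:
            self.data = {}
            self.destroyed = True
```

The point of the sketch is that both obligations are keyed to the collection purposes: those purposes gate every use, and their fulfilment triggers destruction.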

Part 4A recognises that entities may handle personal information when undertaking age assurance for SMMA compliance purposes. The provisions aim to ensure that age assurance is conducted in a manner that protects user privacy, particularly for children, and that platforms are compliant with both online safety and privacy laws.

What are the privacy issues with SMMA Requirements?

Key privacy issues arising from SMMA requirements include the collection and handling of children’s personal information (which otherwise would not be required) and potential risks of data breaches. 

Platforms must be transparent about how age verification data is collected, stored, and used, and ensure robust safeguards against unauthorised access. 

Additionally, platforms need to balance compliance with the minimum age requirements against minimising the amount of personal data collected, while still supporting the exercise of users’ rights under the Privacy Act 1988.

What privacy steps should regulated platforms take?

The OAIC has published Privacy Guidance on Part 4A (Social Media Minimum Age) of the Online Safety Act 2021 for age-restricted social media platforms and third-party age assurance providers to help them address privacy issues with the new requirements.

The Guidance lists the following key considerations for ensuring SMMA compliance with Part 4A and the Privacy Act:

  • Necessity and proportionality: Use an age assurance method (or combination of methods) that is both necessary for SMMA compliance purposes and proportionate to the legitimate aim of preventing age-restricted users from having accounts. Prefer low-intrusion techniques, escalating to more intrusive personal information handling only where necessary.
  • Privacy by design: Take a privacy by design approach and consider the privacy impacts associated with each age assurance method (e.g. inference, estimation and verification) and whether the circumstances surrounding the specific chosen method(s) justify the privacy risks.
  • Privacy impact assessment: Undertake a PIA when choosing an age-assurance method(s) to identify potential privacy impacts at the outset and implement recommendations to manage, minimise or eliminate them. 
  • Data minimisation: Minimise the inclusion of personal and sensitive information in age assurance processes. Only retain enough personal information in outputs to meet defined purposes, such as to explain the measures implemented for a user and to facilitate reviews or complaints, then destroy on schedule.
  • Data deletion / destruction: Destroy any inputs that have been collected immediately once the purposes of collection have been met (e.g. biometric information, biometric templates, identity documents). Avoid purpose ‘padding’ and ensure destruction includes caches and storage.
  • Allowable secondary use: Existing personal information used for age assurance does not need to be destroyed where the original purposes for its collection are ongoing. In other words, using personal information for SMMA does not of itself require its deletion, provided there is a continuing allowable purpose of use.
  • Consent for use of SMMA data: Take care when designing consent requests for secondary uses and disclosures of personal information collected for SMMA. Secondary use and disclosure should be strictly optional and easily withdrawn. The consent request should be clearly written and designed for all users.
  • Transparency: Transparency is always important. Use APP 5 just-in-time notices to explain key information such as what is collected, why, by whom, how long it is retained, and the user’s choices (including alternative methods and review processes). APP 1 privacy policies should be updated with clear and transparent information, supported by clear policies and procedures to facilitate this transparency.

What should you be doing?

To ensure compliance with SMMA, organisations should review their current privacy practices and implement robust age assurance mechanisms that align with both OAIC and eSafety guidance. Regular staff training on privacy obligations and the development of clear privacy policies are essential steps.

It is also important to monitor developments in legislation and regulatory guidance to maintain up-to-date practices.

For more guidance on what you might need to do to prepare for the new SMMA requirements – download our Checklist for compliance with Australia’s SMMA Requirements.

More resources:

Privacy Guidance on Part 4A (Social Media Minimum Age) of the Online Safety Act 2021

Social media age restrictions hub | eSafety Commissioner

Social Media Minimum Age | OAIC

Privacy, security and training. Jodie is one of Australia’s leading privacy and security experts and the Founder of Privacy 108 Consulting.