Online age verification in Australia: A work in progress

16 December 2024

Australia’s social media ban for users under 16 has recently attracted worldwide attention. However, other reforms are also underway in Australia that may have all of us reaching for our IDs more often when going online, including the Children’s Online Privacy Code and new Codes of Practice for the Online Industry.

The final parliamentary sittings for 2024 passed the under 16s social media ban as well as the legislation necessary for the Children’s Online Privacy Code, while earlier this year, consultation drafts of the Consolidated Industry Codes of Practice for the Online Industry (Class 1C and Class 2 Material) were released. The Industry Codes will prescribe age verification requirements intended to restrict access to material such as online pornography.

While none of these regulations have immediate effect, online content providers should begin making preparations now, and some recommendations are set out below. The legislation and codes will allow providers some flexibility in the implementation of age verification requirements, but at the cost of certainty when it comes to compliance. It is therefore essential for providers to keep abreast of developments in this area.

In particular, the Australian Government is undertaking the Age Assurance Technology Trial to ‘determine the effectiveness of available technologies to better protect young people by limiting their access to harmful and inappropriate content online’. The results of this study are expected to be pivotal for informing online providers about appropriate verification mechanisms, but the final report is not expected until mid-2025.

This article provides a brief overview of each of these developments, when they are expected to take effect and who needs to know about them.

Social media ban for users under 16

The new laws will require social media providers to ensure that Australian account holders are 16 years or older.

Legislation: Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth), amending the Online Safety Act 2021 (Cth).
Timing: The amendments will take effect in mid December 2025. However, the age verification requirements will not take effect until a day to be specified by the Minister.
Relevant providers: All providers of social media platforms accessible to end-users in Australia.
Operation: The requirements will apply to an age-restricted social media platform, defined as an electronic service with the sole or significant purpose of enabling online social interaction between two or more end-users; that allows end-users to link to, or interact with, some or all other end-users; and that allows end-users to post material on the service.

Legislative rules may specifically include or exclude particular platforms.

The Minister's Second Reading Speech indicates that at least TikTok, Facebook, Snapchat, Reddit, Instagram, and X (formerly Twitter) will be 'in scope', while the rule-making power may be used to exclude 'messaging services, online games, and services that significantly function to support the health and education of users'.

Online games will also be excluded to avoid overlap with the National Classification Scheme.
Age verification requirement: Providers 'must take reasonable steps to prevent children who have not reached a minimum age from having accounts', whether those accounts are created before or after the amendments take effect. Note that this does not require providers to prevent underage users from having access to content without their own account.
Other considerations: The amendments include specific requirements for dealing with personal information collected for the purposes of compliance.
How to prepare: Providers should first determine whether their services are likely to fall within the definition of age-restricted social media platforms, while also keeping watch for future legislative rules excluding their type of platform due to health and education functions.

If so, it will be necessary to identify 'reasonable steps' for the provider's platform, as well as changes to sign-up processes and information handling practices, user terms, and verification processes for existing accounts.

Providers will most likely have limited access to additional guidance in Australia until the Age Assurance Technology Trial report is available and the eSafety Commissioner has issued guidance materials, so it will be important to keep an eye on these developments. It will also be worth monitoring international developments such as the UK’s Children's Code and progress of the ISO standard for age assurance (ISO/IEC 27566).

Providers should follow and consider contributing to any further consultation processes undertaken by the Government directly or through an industry association.

Children’s Online Privacy Code

Whether or not the new Code prescribes age verification requirements, providers may need to determine whether or not users are children in order to apply the personal information handling practices required by the Code.

Legislation: The Privacy and Other Legislation Amendment Act 2024 (Cth) amends the Privacy Act 1988 (Cth) to require the Information Commissioner to develop a Children’s Online Privacy Code (Code).
Timing: The amendments were passed on 29 November 2024. The Code must be developed and registered by mid December 2026.
Relevant providers: The Code will apply to some, but not all, entities caught by the Privacy Act.
In broad terms, the Privacy Act applies to private and non-profit organisations (APP Entities) with total annual revenue of more than $3 million, including overseas organisations with an 'Australian link'. The Australian link requirement is broad and may be satisfied where the organisation carries on business in Australia or collects or holds personal information in Australia.

However, the Code will only apply to providers of online services such as websites, apps, streaming services, social media services, online messaging and communication facilities (including online games), and only if 'the service is likely to be accessed by children', and is not a health service.
Operation: The Code will 'set out how one or more of the Australian Privacy Principles are to be applied or complied with in relation to the privacy of children'.

While the amendments themselves include very limited information as to what this means, the Office of the Australian Information Commissioner has released commentary, including a statement that the Commissioner will 'look to align the code with the UK’s Age Appropriate Design Code, while recognising there are differences in the underlying legal frameworks, and leverage the learnings of our international counterparts'.

A broad consultation process will commence in 2025.
Age verification requirement: Although the Code will not focus on preventing access to content and services by children, providers may need to determine whether or not users are children, or likely to be children, in order to apply the different personal information handling practices required by the Code.

The Privacy Act now defines a 'child' as any individual under 18 years of age, but does not itself provide any other guidance or impose any age verification requirements.

Outcomes from other processes such as the Age Assurance Technology Trial report and consultation on the Code will enable providers to be prepared for any measures required.
How to prepare: Providers should first determine whether their services are likely to be of a type to which the Code will apply.

If so, providers should then:
  • review and consider the requirements of the UK Children's Code; and

  • monitor the Office of the Australian Information Commissioner's activities in this area over the next one to two years, including following and possibly contributing to the consultation processes directly or through an industry association.

Consolidated Industry Codes of Practice for the Online Industry (Class 1C and Class 2 Material)

The Codes will require some providers of online services to apply age gating, particularly if their services provide, or permit the sharing of, material that is lawful for adults to view but prohibited for children.

Legislation: In October 2024, consultation drafts of the Consolidated Industry Codes of Practice for the Online Industry (Class 1C and Class 2 Material) were released. Among other things, the Codes prescribe age verification requirements intended to protect children from exposure to online pornography and other potentially harmful content that is lawful for adults to view. If accepted and registered by the eSafety Commissioner (eSafety), the Codes will become mandatory under the Online Safety Act 2021 (Cth).
Timing: The consultation period ended on 22 November 2024. It is expected that the final Codes will be submitted to eSafety for registration by 19 December 2024. However, eSafety has reserved its position as to whether or not it will endorse the Codes for registration.

The Codes will become enforceable six months after registration. After that date, a provider receiving a compliance notice from eSafety will not be in breach if it can demonstrate that it is working towards achieving compliance on or before 12 months after registration.
Relevant providers: The Codes apply to most providers of websites, apps and other online services making content or messaging available to Australian end users, but compliance is required only to the extent that the services are provided to Australian end users.

The specific Code applicable will depend on which of the following categories the provider falls into: social media services; 'relevant electronic services' (eg online messaging, online gaming with communication functionality); 'designated internet services' (most websites, apps and streaming services that don't fall into a more specific category); internet search engine services; app distribution services (eg app stores); hosting services (hosting material in Australia); internet carriage services; and manufacturing, supply, maintenance or installation of equipment for use of any of the preceding services.

Only the Code 'most closely aligned with the predominant purpose' of a service will apply.
Operation: The Codes primarily address access to material of a type that would have an X18+ or R18+ rating (or equivalent) under the National Classification Scheme, including online pornography and other high-impact material involving sex, nudity, violence, drug use, language and themes. In relation to computer games, this extends to simulated gambling.

Under the current drafts, age verification requirements will only automatically apply to a limited class of providers - particularly, providers of:
  • social media services that permit high impact online pornography and self-harm material;

  • services that provide content or apps that include X18+ material or simulated gambling games; and

  • services that have the sole or predominant purpose of providing or permitting users to share (or generate using AI) high impact online pornography.

However, the Codes are quite technical and the requirements of each Code vary significantly according to the category of the provider and the types of services, technologies, content and materials involved. Even where age assurance is not specifically required, applying some of these measures may require knowledge of users' ages.

The Codes include requirements concerning risk assessments, default safety settings, notices, filtering and flagging/reporting/complaint functionality, inclusion and enforcement of policies and user terms, investment in child protection systems, and publication of information about actions taken to reduce harm, online safety and the role and functions of eSafety.

The Online Safety (Restricted Access Systems) Declaration 2022 (Cth) already prescribes basic restricted access system requirements for online service providers to restrict access to R18+ material by persons under 18 years of age, if required to do so under the Online Safety Act. However, the Codes go well beyond this in terms of detail and breadth of requirements.
Age verification requirement: Appropriate age assurance measures must 'at a minimum include reasonable age assurance measures to … identify whether an Australian end-user is a child' and may include solutions aimed at verifying exact age or age ranges, as well as age estimation solutions. The drafts indicate that:
  • appropriate age assurance measures will include (without limitation) matching of photo identification; facial age estimation; credit card checks; digital identity wallets or systems; attestation by a parent or guardian of age or whether an Australian end-user is a child; and relying on appropriate measures already implemented by another party; and

  • self-declaration and contractual restrictions on the use of a service by children will not be considered appropriate, without more.

The Codes acknowledge that approaches to age assurance are still developing and may change over time; that some measures may not be accurate and may be circumvented; and that in some cases it may even be reasonable not to adopt compliance measures.

Industry participants must take steps including risk profiling and adopting 'reasonable compliance measures', taking into account a range of factors listed in the Head Terms, including 'the technical accuracy, robustness, reliability and fairness of the solution for implementing the measure'.
How to prepare: Providers should first determine whether their services are likely to be of a type to which one of the Codes will apply and, if so, identify and consider how to apply the relevant Code.

Even where age assurance requirements do not automatically apply, providers will need to undertake risk assessments, privacy impact assessments and other preparatory activities in order to identify and implement necessary measures before the Codes come into effect.

Providers should keep records of compliance activities. Some obligations apply only to the extent technically feasible and reasonably practicable, but eSafety may require reports and other information to justify a provider's conclusions.

Providers should also monitor any developments arising from the Age Assurance Technology Trial and any guidance materials issued by eSafety. Material in relation to the under 16s social media ban may be of interest, if it is released first.

Providers should also ensure they are across the Phase 1 Consolidated Industry Codes of Practice, which apply to Class 1A and 1B material - broadly, child sexual exploitation and pro-terror material, and material dealing with crime, violence and drug-related content that would be refused classification under the National Classification Scheme.

We will continue to keep our clients up to date with developments in this area through regular updates and this newsletter. Please contact us if you require any further information.

This article was written by Michael Boughey, Partner.

