Mum, can I borrow your phone? Proposed social media ban for under-16s

27 November 2024

Introduction

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth) (Bill) proposes amendments to the Online Safety Act 2021 (Cth) (Act) aimed at restricting access to social media platforms for users under the age of 16. If enacted, the Bill would place a statutory obligation on social media providers to take reasonable steps to prevent users below this age threshold from creating accounts. This framework reflects a policy-driven approach to addressing the developmental and psychological risks associated with adolescent exposure to social media.

Establishing a minimum age for social media use

The Bill’s introduction of a minimum age for social media use would mark a departure from current industry norms, which generally allow users aged 13 or older to create accounts. This threshold derives from the US Children’s Online Privacy Protection Act of 1998 (COPPA), which prohibits data collection from children under 13 without parental consent. COPPA, however, is a US data protection measure that pre-dates the rise of modern social media platforms, not a framework grounded in developmental considerations or evidence-based assessment of the risks of social media use. In practice, minimum ages specified in platforms’ terms and conditions are often disregarded by children and parents alike.

In contrast, the Bill seeks to target risks specific to adolescents, particularly the negative impacts on mental health and well-being identified in various studies. Research cited by the Australian Government, including longitudinal data from the UK, highlights how early exposure to social media can adversely affect life satisfaction, particularly during key developmental stages. By setting the minimum age at 16, the Bill reflects the view that older adolescents are better equipped to navigate the complexities of social media environments.

Scope of platforms regulated

The Bill introduces the term ‘age-restricted social media platform’ under s63C. This definition broadens the scope of regulated platforms beyond the narrower ‘social media service’ defined in s13 of the Act. For the purposes of the Bill, an ‘age-restricted social media platform’ would mean an electronic service that satisfies all of the following conditions:

  1. the sole purpose, or a significant purpose, of the service is to enable online social interaction between 2 or more end-users;
  2. the service allows end-users to link to, or interact with, some or all of the other end-users; and
  3. the service allows end-users to post material on the service.

The inclusion of a ‘significant purpose’ test ensures that platforms where social interaction is an important, albeit not primary, feature are captured. However, s63C(2) clarifies that platforms with only ancillary social features, such as online marketplaces where reviews or feedback are secondary, are excluded from the definition.

To further refine the scope, the Bill grants the Minister for Communications rule-making powers to impose additional conditions or to exclude specific platforms or services, such as messaging apps and educational tools, where risks to users under 16 are deemed minimal. This flexibility reflects the Government’s intention to target high-risk platforms without unduly burdening lower-risk services.

Minimum age obligation

If enacted, the Bill would impose a statutory obligation under s63D requiring social media providers to take reasonable steps to prevent individuals under 16 from holding accounts. The obligation is framed to ensure systemic compliance, with a focus on preventative measures rather than penalising isolated instances of circumvention. It targets the creation and holding of accounts by Australian children under 16, not access to the content on the platform itself: users under 16 could still view social media content, so long as they do so without logging into or holding an account with the platform. Nor does the obligation preclude a parent or caregiver from providing access to their own social media account for use by a child under 16.

The Bill also deliberately refrains from prescribing specific compliance measures, allowing platforms to adopt age-verification mechanisms suited to their technological and operational structures. Compliance would be assessed on an objective standard of reasonableness, with considerations including the efficacy of age-assurance technologies, proportionality of costs, and the privacy implications of the chosen methodology.

The lack of prescriptive guidance may initially make it difficult for social media providers to determine what constitutes adequate age-assurance. However, the outcomes of the Government’s age assurance trial, funded in the 2024–25 Budget, will serve as a reference point for regulated entities. Accordingly, social media providers may wish to prepare or enhance existing systems to align with the reasonable steps obligation, potentially engaging third-party solutions or entering collaborative arrangements with app distribution platforms, should the Bill be enacted.

Enforcement and penalties

The Bill imposes substantial penalties for systemic breaches of the minimum age obligation, up to a maximum of $49.5 million for bodies corporate.

These penalties are designed to ensure a meaningful deterrent effect, particularly for major platforms with significant financial resources. They also align with comparable penalty provisions in the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Consumer Law, providing consistency in regulatory enforcement.

Privacy protections

Operators of age-restricted social media platforms will presumably need to collect a significant amount of information about individuals in order to meet their age assurance obligations, potentially including copies of government identification cards.

Section 63F introduces specific safeguards for the handling of personal information collected for age assurance. Platforms are prohibited from using such data for purposes other than those directly related to compliance with the minimum age obligation, unless explicit and informed consent is obtained from the individual.

Once the purpose for which the information was collected is fulfilled, platforms must destroy the data, subject to certain existing exceptions under the Privacy Act. Non-compliance with these provisions constitutes an interference with privacy and attracts penalties under s13G of the Privacy Act, which may exceed $50 million for serious or repeated breaches.

Regulatory oversight

The Bill enhances the powers of the eSafety Commissioner under s63G, authorising the issuance of information requests to monitor compliance with the minimum age obligation. The Commissioner may also issue guidelines on what constitutes reasonable steps, informed by the outcomes of the age assurance trial, to assist platforms in meeting their obligations.

The Commissioner’s enforcement toolkit includes public notifications of non-compliance under s63J, which may have reputational implications for social media providers found to breach their obligations. These enforcement powers are complemented by increased penalties for breaches of existing provisions in the Act, such as non-compliance with industry codes and standards.

Delayed implementation and review

Section 63E provides for a delayed commencement of the minimum age obligation, with a minimum lead time of 12 months following Royal Assent. This delay acknowledges the operational changes required by platforms and allows the Commissioner to develop guidance based on the age assurance trial’s findings.

The Bill would also mandate a review of the framework within two years of its commencement, enabling the Government to assess its effectiveness and adapt to changes in the digital environment. This review may consider adjustments to the minimum age threshold, the scope of regulated platforms, or the powers available to the Commissioner.

Broader challenges and limitations

Social media providers may face significant practical challenges in complying with the Bill. Beyond the required investment in age-assurance technologies, whose effectiveness varies, there is the issue of user circumvention. As the drafters acknowledge, it will be impossible to prevent all Australians under 16 from holding social media accounts.

In practice, adolescents are likely to employ a variety of methods to evade age-assurance measures. Users under 16 may provide false information, use Virtual Private Networks (VPNs) to disguise the geographic origin of their traffic, or enlist others to create accounts on their behalf. Such conduct may undermine the regulatory framework despite social media providers’ best efforts.

Further, the Australian Privacy Commissioner expressed concerns during Senate Committee scrutiny of the Bill, positing that excluding under-16s from mainstream social media platforms could inadvertently drive children towards lower-quality, less regulated online environments.

Separately, other legislation currently before Parliament would, if enacted, empower the Privacy Commissioner to develop a Children’s Online Privacy Code, particularising the requirements of the Privacy Act for social media platforms. For more information on the Children’s Privacy Code, read our article Privacy reforms – The Children’s Online Privacy Code.

Next steps

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 remains subject to passage by the Parliament, and may yet be amended before becoming law.

For social media providers, the Bill, if enacted, may present compliance challenges, particularly the need for significant investment in age-verification technologies, operational adjustments to align with privacy requirements, and proactive engagement with the eSafety Commissioner’s oversight processes.

The success of the Bill will likely hinge not only on the adequacy of the systems implemented by social media platforms but also on societal attitudes towards social media use by young people. Parents, guardians, and broader community stakeholders will play a crucial role in reinforcing the objectives of the Bill and fostering a safer digital environment for adolescents.

This article was written by Daniel Kiley, Partner; Nikki Macor Heath, Special Counsel; and Christopher Power, Law Graduate.
