Treat me like a child? How the Children’s Online Privacy Code could force a privacy rethink
In the flurry of privacy law and social media ban amendments passed in the dying days of 2024, the prospect of a children’s privacy code seemed to fly under the radar. In 2025, we flagged to our clients (see OAIC invites consultation on Children’s Online Privacy Code) that such a code could be far more impactful to more businesses than the social media ban, despite the latter receiving the lion’s share of media attention. This appears to have been borne out in the exposure draft of the code recently released for consultation, leaving affected businesses facing some difficult threshold decisions.
On 31 March 2026, the OAIC released an exposure draft of the Privacy (Children’s Online Privacy) Code 2026, allowing until 5 June 2026 for submissions. The code must be registered by 10 December 2026, but is not expected to come into effect until some time after that. Without certainty as to how much implementation runway the code will allow, entities with any exposure to children’s personal information should start considering their options now. We explain below who the code applies to, what will change for affected entities, and why those entities should start considering certain key threshold decisions now.
The application threshold question
The draft code applies to a wide range of online services (defined to align with the Online Safety Act 2021) that are either:
- likely to be accessed by children; or
- primarily concerned with the activities of children.
The first limb has a potentially very broad scope, particularly given the wide range of services that may be ‘likely’ to be accessed by individuals ranging in age from toddlers to 17-year-olds. The draft code does not explain what makes a service ‘likely’ to be accessed by children, so even determining whether a service is captured may present a challenge.
The second limb extends coverage to services that focus on children’s activities but interact with children only indirectly, for example via parents engaging with school apps, family photo platforms, and IoT devices such as baby monitors.
The draft code does exclude online health services; however, platforms offering administrative support to health services may still be captured if they are not, in and of themselves, health services.
Only services within those categories are captured, so an entity that can distinguish ‘adults only’ services (home loan services offered by a bank, for example) will need to decide whether to implement separate processes for those versus its child-facing services.
The age assurance threshold question
The key question for affected entities will be this: do we implement complex and potentially costly age assurance measures – or do we apply the higher protections required for children to everyone?
Unless an entity takes a ‘highest common denominator’ approach, the draft code requires reasonable steps to be taken to ascertain an end-user’s age before collecting their personal information. To do so, entities must only use ‘necessary’ information, and sensitive information (such as biometrics) used for this purpose must be destroyed as soon as practicable afterwards (subject to certain exceptions).
What steps are reasonable in the circumstances should be assessed by entities with regard to the ‘risk of harm’ from the information collection, use or disclosure, taking into account a range of factors. It seems unlikely that the type of age assurance we are familiar with on, say, alcohol websites (‘Yes, I’m definitely 18, promise’) will suffice, so it may be difficult for affected entities to achieve comfort as to the sufficiency of steps put in place without utilising expensive or complex processes.
In addition, there are requirements to ascertain that adults providing consent on behalf of children actually have parental responsibility for the child. It is far from clear what processes could be put in place practicably to achieve this in a streamlined, privacy-protective manner. Current OAIC guidance points to layering of different mechanisms. On-device verification through major operating system architectures, or tokens issued by banks or telecommunications companies, could potentially be leveraged. However, it seems unlikely that any one solution will solve the problem universally.
Several observers have noted the regulatory tension in collecting potentially sensitive information for the purpose of ostensibly providing higher privacy protections to a certain group. Moreover, the technical and operational mechanisms needed to implement the required age assurance and verification appear likely to be complex and possibly costly, with high-impact regulatory exposure if not tackled appropriately.
The higher privacy standards for children
The draft code proposes a significant step up in privacy protections for children across a range of touchpoints, including:
- Measures must be implemented to limit default information collection to that strictly necessary (as opposed to reasonably necessary) for the entity’s service;
- Information is required to be provided to the child, even if a parent is required to consent on their behalf. This could potentially create information fatigue, especially given the nature of the target audience;
- Consents need to be unbundled and cannot prevent use of the service unless strictly necessary to provide it, in contrast with typical privacy policy and consent practices;
- Consents expire after 12 months, potentially creating a significant attrition risk, as well as administrative burden;
- Consents and explanations must be specific, effectively ruling out the sweeping catch-all references commonly seen in privacy policies and statements;
- Consents must be unambiguous, meaning that organisations may not be able to rely on broad implied consent mechanisms, as is currently standard practice;
- In what appears to be an internationally novel approach, child ‘assent’ is required in addition to parental consent in some circumstances, for example where a child opts in to collection of sensitive information or any direct marketing;
- In addition to the general requirement under the Australian Privacy Principles to obtain consent to use sensitive information for direct marketing, consent is required to use non-sensitive information about children for that purpose;
- Information given to children is generally required to be provided in an ‘age appropriate manner’. In the absence of a specific target age range for the relevant service, this defaults to a manner appropriate for 10-12 year olds and encompasses using non-text material such as videos and animations (and certainly no technical or legal language). Notably:
  - This means separate versions of all relevant documents will need to be developed, potentially requiring the involvement of new professional skill sets such as child development consultants, graphic designers or animators; and
  - It could pose a particular challenge for providing explanations of information handling practices (especially of technical processes such as automated decision making tools) as is also required by the code;
- Collection, use and disclosure of information must be consistent with the ‘best interests of the child’. This concept is not defined, but is likely to be interpreted consistently with the UN Convention on the Rights of the Child, and could introduce human rights-style analysis into privacy compliance;
- ‘Dark patterns’ such as ‘confirmshaming’, or using euphemisms like ‘showing you more of what you want to see’ for end-user profiling activities, are prohibited;
- Children must be notified if they are being tracked by parents or third parties; and
- Children receive a right to request destruction of information, under which an entity has an obligation to destroy the information on request, unless one of several prescribed exceptions applies. This obligation could extend to backups and would not be satisfied through deidentification.
As a package, these uplifts are likely to be onerous for captured entities, and in some respects are accompanied by uncertainty as to what exactly is required, adding layers of risk that will need to be considered when deciding on an approach to tackling code compliance. Moreover, for some services, the changes could require fundamental product and user experience redesign.
The new ongoing compliance obligations
On top of these higher privacy protections, the draft code requires new organisational measures to be implemented, including annual reviews of privacy practices and annual staff training. Requests for information, correction or destruction of information need to be responded to within specific timeframes. The draft code makes Privacy Impact Assessments mandatory for any new service or significant change to a service that is subject to the code. Even more significantly, it also requires them to be recorded in a public register, introducing potential reputational and regulatory risk.
As is apparent from the above, the compliance burden associated with the Children’s Online Privacy Code is potentially substantial. Now is the time to:
- Consider whether the code might apply to any of your services;
- Decide whether to implement age-gating or a universal uplift;
- Map your current data collection, defaults, consent and marketing practices; and
- Brief your senior management and board.
The guidance accompanying the UK’s Age Appropriate Design Code, with which the draft code is intended to align, is a useful resource for understanding in greater depth the kinds of adjustments likely to be required. Otherwise, reach out to our team if you’d like to discuss how to tackle this issue or would like our assistance in drafting a submission to the consultation on the draft code.
HWLE Lawyers’ Privacy, Data Protection and Cyber Security team has extensive experience advising on privacy obligations and issues. If you have any questions or concerns about the upcoming Children’s Online Privacy Code and your potential obligations, please do not hesitate to contact us.
This article was written by Nikki Macor Heath, Special Counsel, with Daniel Kiley, Partner and Maximilian Soulsby, Associate.