Exposure Draft proposes to raise privacy standards but contains flaws
The Office of the Australian Information Commissioner (OAIC) has released an Exposure Draft of the Privacy (Children's Online Privacy) Code 2026 (the Draft Code), together with a draft Explanatory Statement.
The Draft Code proposes a number of substantial new obligations that extend well beyond the existing requirements of the Australian Privacy Principles (APPs) in the Privacy Act 1988 (Cth) (Privacy Act). If implemented in its proposed form, it would require fundamental changes to the way many online services are made available to all users, and would impose significantly more onerous obligations on the handling of children's personal information. It introduces material new concepts, including a 'best interests of the child' test, mandatory privacy-by-default settings, strict new consent and double-consent mechanisms, and a right for both children and parents to request destruction of personal information.
The Draft Code would, in its current form, present significant implementation challenges for regulated organisations, impose a very high compliance burden and lead to potentially significant unintended consequences. It may even result in some organisations choosing to restrict children from accessing their online sites altogether.
The Draft Code is open for consultation until 5 June 2026. Organisations that may be subject to the Draft Code should consider responding via email to copc@oaic.gov.au.
Who in your organisation needs to know about this and why?
- General counsel and legal or regulatory affairs teams to assess whether and how the Draft Code will apply to your organisation, how it will interact with the APPs and the Online Safety Act 2021 (Cth) (the Online Safety Act) and the implications it may have on online services currently offered by your organisation.
- Privacy officers to evaluate the compliance steps required to implement the proposals in the Draft Code, including the potential requirement to undertake age-gating assessments, as well as the impacts of redesigning default settings, implementing new workflows and conducting mandatory privacy impact assessments.
- Product, engineering and UX/design leads to scope technical requirements for age-appropriate notices, privacy-by-default configurations, child-friendly access mechanisms and controls enabling children to toggle non-essential data handling.
- Marketing and growth teams to understand new constraints on direct marketing to children, including the 'best interests' test and specific consent requirements.
- Business leads to make strategic decisions about the potential impact on existing online services and whether and how to implement the new obligations in relation to children.
- Board members and NEDs to consider the commercial impact, enforcement exposure and reputational risk.
Key takeaways
- Extremely broad and ambiguous scope. The Draft Code is proposed to apply to providers of social media services, relevant electronic services and designated internet services that are 'likely to be accessed by children' or 'primarily concerned with the activities of children'. There is no clear guidance on what these tests mean in practice. On its face, the 'likely to be accessed' test appears to be very broad given the propensity of children to navigate the internet broadly via search engines and other means. Any consumer-facing website, online service or app could be caught.
- Upfront age assessment criteria. The Draft Code proposes to require entities to either conduct an up-front assessment of the age of an end user of the applicable online service, or to apply the protections of the Draft Code to all end users of that service. Entities that provide services to a range of potential end users would need to either create bifurcated processes and specific systems for different classes of users, or fundamentally reshape the privacy posture of the service. If the onerous requirements proposed in the Draft Code are implemented, we expect many services that do not specifically target children may choose to age-gate services entirely to avoid being made subject to the new requirements.
- 'Best interests of the child' as a gating requirement. The Draft Code proposes to layer a 'best interests' assessment—drawn from the UN Convention on the Rights of the Child (UNCRC)—on top of existing APP 3 collection and APP 6 use/disclosure requirements. The 'best interests' test would materially narrow entities' flexibility in their collection, use and disclosure of personal information. Entities caught by the Draft Code would need to undertake this assessment in respect of all collection, use and disclosure activities, except in very limited circumstances (even where the individual has consented to a particular activity). There is no clarity on how an organisation would assess what is in the 'best interests' of a child.
- Privacy by default—strictly necessary PI only. The Draft Code proposes that default settings must ensure only 'strictly necessary' personal information is collected, used or disclosed. This would effectively ban not only default-on targeted advertising, but also all personalisation and recommender systems for children's accounts. All existing accounts or collections of end users who are (or may be) a child would need to be 'reset' on implementation of the Draft Code. This might have the counter-productive effect of removing personalisation processes which favour child-appropriate delivery of material.
- New consent architecture with a 12-month hard cap. The Draft Code proposes to hard-code an age of consent at 15, prescriptively expand on existing guidance on appropriate consent, create a novel 'double assent' requirement for under-15s in certain circumstances, and impose a maximum 12-month validity period on all consents for activities not strictly necessary for a service. These would be highly onerous to implement, and no guidance has been provided on how an organisation would ensure the appropriate individuals have provided the relevant forms of consent.
- Right to destruction (not just de-identification). Children, or parents on their behalf, would have a new right to require destruction of their personal information. Unlike the APP 11 requirement, de-identification is explicitly excluded and there is no proposal for a 'reasonable steps' qualifier. There are also no limits proposed on the circumstances in which destruction can be requested and very few circumstances in which the request may be rejected. In contrast to the rest of the Draft Code, there is currently no 'best interests of the child' qualifier for the entity in considering whether to destroy the information. It also potentially creates a right to require destruction of information owned or controlled by other individuals. This proposal risks creating a number of unintended consequences.
What is the Draft Code?
As we reported last year, the proposal to introduce a Children's Online Privacy Code was passed into law pursuant to the Privacy and Other Legislation Amendment Act 2024 (Cth) in response to growing concerns about children's exposure to large-scale data collection by online platforms.
In developing the Draft Code, the OAIC engaged with key stakeholders across government, international regulators, industry, academia and the community and consulted children, young people, parents, children's welfare organisations, the eSafety Commissioner and the National Children's Commissioner. The OAIC is now consulting on the Draft Code until 5 June 2026, with a view to registering the Draft Code by the required registration date of 10 December 2026.
Once registered, the Draft Code will impose binding obligations on APP entities to which it applies. A breach of the Draft Code will constitute an interference with privacy under section 13 of the Privacy Act, in the same manner as a breach of the APPs.
Breaking down the issues
We outline below each of the key proposed requirements in the Draft Code and the questions and issues these raise.
Scope and application of the Draft Code (ss 5 – 7)
Which entities would be regulated?
The Draft Code is proposed to apply to providers of:
- social media services;
- 'relevant electronic services' (eg email, chat services and online multiplayer gaming services); or
- 'designated internet services' (a broad category capturing services that enable end-users to access or deliver material via the internet, including file and photo storage services and certain generative AI services),
where the service is 'likely to be accessed by children' or 'primarily concerned with the activities of children' (excluding health services). These terms have the meanings given in the Online Safety Act.
What services of those entities are regulated?
The Draft Code applies only to the specific services of an entity that meet these criteria, not all of its services.
The 'primarily concerned' test is designed to capture services that may not be used by children but are about or directed at children.
- The proposed scope and application provisions are broad and ambiguous, with no meaningful guidance on the threshold for when a service is 'likely to be accessed by children'.
- The combination of the 'likely to be accessed' test with its application to 'designated internet services' appears to mean that any B2C website (including e-commerce platforms and news or general information websites) is likely to be caught where its content, products or services could reasonably be accessed or acquired by someone under 18.
- Entities would need to assess whether their services are likely to be accessed by children and, if so, whether the additional compliance burdens of the Draft Code can be met. We anticipate that many services which do not have children as their core audience may respond by age-gating their platforms rather than complying with the Draft Code.
The Draft Code will clearly capture social media, online messaging, and many gaming and entertainment products used by children.
The Draft Code gives the following examples of services that are 'primarily concerned with the activities of children': apps that track early childhood development, family photo sharing apps, online school management systems that monitor student performance and internet-connected baby monitors.
Other businesses which may be captured include:
- online marketplaces or e-commerce platforms selling children's products;
- financial products and services likely to be accessed by children (eg children's savings products), including general products that can be opened for a child (eg deposit accounts);
- file and photo storage apps used by children or adults to store, view and share photos;
- many websites with material that may be of interest to, and accessed by, anyone under 18. It is important to note that the online activities of a 17-year-old are significantly different from those of a 10-year-old.
Use of age assurance (s 8)
- Before collecting personal information, entities would need to take reasonable steps to ascertain an end-user's age.
- The reasonableness of those steps is informed by the risk of harm that may arise from the entity's collection, use or disclosure in the ordinary course of providing the service.
- Personal information may be collected for this purpose, but only to the extent necessary to ascertain the user's age.
- This obligation would not apply if an entity complied with the Draft Code in respect of all end-users of relevant services regardless of age.
- Entities whose services would be subject to the Draft Code must decide whether to apply its requirements to all users or undertake age assurance to differentiate between child and adult users.
- The efficacy of age assurance technology has already been the subject of significant debate following the introduction of the 'social media ban'. The OAIC has issued 'Privacy guidance on age assurance technologies', covering methods such as age inference, estimation, verification, self-declaration and parental attestation.
- The requirement to assess risk and determine appropriate steps for different services is uncertain and highly contextual, which may lead entities to adopt 'high bar' assurance mechanisms as a default rather than risk their steps being deemed 'unreasonable'. While the OAIC has criticised entities for setting high access bars and over-collecting personal or sensitive information for age assurance purposes,1 the Draft Code in its current form could exacerbate that trend.
A subscription information content service (eg a subscription to a news website) that is likely to be accessed by children will need to determine what 'reasonable steps' are required at sign-up to confirm whether a subscriber is under 18.
Default privacy settings (s 9)
- Entities must implement technical and organisational measures to ensure that, by default, only personal information that is 'strictly necessary' to provide the service is collected, used or disclosed.
- Children must be able to control collection, use or disclosure of personal information that is not strictly necessary, via clear, simple and easily accessible means.
- This applies to both existing and new accounts from commencement of the Draft Code.
- This would effectively prohibit default-on collection for targeted advertising, personalisation and recommendation systems. The use of personal information for service improvement must also not be enabled by default.
- This would have profound implications for the business models of many online services with a substantial proportion of child users or that are focused on children, particularly those funded through advertising or reliant on algorithmic content recommendation.
A streaming service likely to be accessed by children collects personal information to personalise viewing recommendations. It would need to reconfigure default information collection for children's accounts (or children's profiles on an adult's account) so that only information strictly necessary for provision of the service is collected automatically. This may have the unintended consequence of reducing age-appropriate content recommendation.
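The default-settings requirement in s 9 translates, in engineering terms, into a settings model where every non-essential processing purpose starts disabled and each can be toggled individually by the child. The sketch below is a minimal illustration under our own assumptions; the purpose names and the split between 'strictly necessary' and non-essential purposes are hypothetical and would need to be assessed per service.

```python
# Hypothetical s 9 sketch: non-essential data-handling purposes default
# to OFF for a child's account and are individually controllable.
from dataclasses import dataclass, field

STRICTLY_NECESSARY = {"account_management", "content_delivery"}
NON_ESSENTIAL = {"targeted_advertising", "personalised_recommendations",
                 "service_improvement_analytics"}

@dataclass
class ChildPrivacySettings:
    # Every non-essential purpose starts disabled by default.
    enabled: dict = field(
        default_factory=lambda: {p: False for p in NON_ESSENTIAL})

    def may_process(self, purpose: str) -> bool:
        """Strictly necessary purposes are always permitted; anything
        else only if the child has explicitly switched it on."""
        if purpose in STRICTLY_NECESSARY:
            return True
        return self.enabled.get(purpose, False)

    def toggle(self, purpose: str, on: bool) -> None:
        """The 'clear, simple and easily accessible' control surface."""
        if purpose not in NON_ESSENTIAL:
            raise ValueError(f"{purpose} is not a user-controllable purpose")
        self.enabled[purpose] = on
```

Note that on this model existing accounts would need to be migrated to the all-off defaults at commencement, which is the 'reset' effect described above.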
'Best interests of the child' concept and its impact (ss 10 – 11)
Collection, use or disclosure generally
- Entities may only collect a child's personal information if the collection is consistent with the child's best interests and may only use or disclose it where consent has been obtained and the use or disclosure is consistent with the child's best interests.
- Standard exceptions are retained (including for compliance with law and permitted general situations) but, importantly, the Draft Code effectively removes a 'reasonable expectations' basis for secondary use and disclosure.
Direct marketing
- The Draft Code overlays additional best-interests and consent requirements on APP 7 (direct marketing) and prohibits direct marketing to children unless the information was obtained directly from the child.
- The Draft Code also imposes additional opt-out requirements that go beyond standard APP 7 obligations.
What is 'best interests'?
The Explanatory Statement refers to Article 3 of the UNCRC and identifies considerations for the assessment, including: mental or physical impacts, impact on the child's development, the ability to develop and express views and identities, and the effect on freedom of association, play, leisure or participation.
- The 'best interests' criterion would create a significant additional hurdle for the collection, use and disclosure of children's personal information, and how it will apply in practice is uncertain.
- 'Best interests' is likely to represent a narrow set of permissible activities, as it:
- adds a substantive gate on top of APP 3 collection tests—even if collection is 'reasonably necessary' under APP 3.2 or meets APP 3.3(a), it must also be consistent with the child's best interests; and
- materially narrows APP 6 flexibility—removing reliance on APP 6.1(b) 'reasonable expectations' for children's information.
- The proposed drafting requires both consent and best interests to be satisfied even for use and disclosure for the primary purpose of collection under APP 6.1.
- It is difficult to envisage when direct marketing could satisfy a 'best interests' test. While the Explanatory Statement notes that the test 'does not mean that an entity cannot pursue its own commercial or other interests', entities will need to assess (and likely document and demonstrate) how the collection, use and disclosure is aligned with the child's best interests.
- The interaction with the Spam Act is unclear, though the implication appears to be that an entity could send a marketing email in accordance with the Spam Act, but would not be able to personalise it for a child unless it complied with the Draft Code.
An online clothing store for teens wants to use personal information about its 16-year-old customers (such as purchase and browsing history) for a direct marketing campaign. The store will need to do all of the following:
- demonstrate that collection of the information is consistent with the child's best interests;
- obtain consent to collect and use/disclose information for direct marketing (see Novel assent / notice requirements below for further details on requirements for obtaining consent);
- assess that such marketing is in the child's best interests (and document such assessment); and
- update its marketing opt-out processes.
New consent requirements (ss 12 – 19, 21)
Age threshold
- Consent may only be given by a child if they are at least 15. Children under 15 require parental consent.
- If parental consent is required, the entity must take reasonable steps to confirm the person who gives consent has parental responsibility.
Consent standards
- Consent must be voluntary, informed, current, specific and unambiguous—omission-based consent (eg pre-ticked boxes or deemed consent from continued use) is expressly prohibited.
- Bundled consent is expressly deemed non-voluntary and consent obtained by manipulative, deceptive or misleading practices is invalid.
Consent period
- Consents obtained from children can only be relied on for a specified period, not exceeding 12 months.
While some of these requirements codify existing OAIC guidance and aspects of the proposed Tranche 2 Privacy Act reforms, the consent framework in the Draft Code also goes well beyond this:
- the 12-month time limit means that no consent can be tied to a period of use of a service, requiring entities to operationalise a process to periodically refresh consent; and
- some examples in the Explanatory Statement of prohibited practices (such as 'confirm shaming') appear to extend the consent restrictions further, prohibiting marketing language designed to encourage certain forms of behaviour.
A parent of a child under 15 consents to the child setting up an online gaming profile. The gaming company must:
- take steps to ensure the parent does in fact have parental responsibility for that child; and
- only rely on that consent for a maximum of 12 months before re-seeking consent.
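Operationally, the 12-month hard cap means every stored consent needs an expiry check before it is relied on. A minimal sketch, assuming a simple date-based model (the function name, a 365-day approximation of '12 months' and the idea of an entity-specified shorter period are our illustrative assumptions):

```python
# Hypothetical sketch of the proposed 12-month consent validity cap:
# a stored consent can only be relied on within its specified period,
# and that period can never exceed 12 months (approximated as 365 days).
from datetime import date, timedelta

MAX_VALIDITY = timedelta(days=365)  # the proposed hard cap

def consent_is_current(given_on: date, specified_days: int,
                       today: date) -> bool:
    """Return True if a child's (or parent's) consent can still be
    relied on. The entity may specify a shorter reliance period, but
    it is capped at 12 months, after which consent must be re-sought."""
    period = min(timedelta(days=specified_days), MAX_VALIDITY)
    return today <= given_on + period
```

A re-consent workflow would then be triggered whenever this check fails, which is the periodic refresh process the alert flags as newly required.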
Novel assent / notice requirements (ss 20, 33)
Several novel notice and assent requirements are introduced to offer greater transparency for children.
Assent
- Where a child under 15 wishes to enable certain higher-risk handling (eg sensitive information collection, secondary use/disclosure, or direct marketing), the entity must seek the child's 'assent' to the handling and to contacting a parent/guardian for consent.
- Assent must be specific and requested via an age-appropriate notice with prescribed content including purpose, period, consequences, withdrawal process and recipient information.
- The assent/consent reliance period must be reasonably appropriate and, in any case, must not exceed 12 months.
Notice of parental monitoring
Where a service enables parental control or monitoring functions (including geolocation), it must provide an age-appropriate notice to the end user.
- The concept of 'assent' is novel and appears to function effectively as a consent, even though the Draft Code otherwise provides that persons under 15 cannot grant consent.
- The 'double assent' mechanism would be operationally highly complex to implement.
- The Explanatory Statement states that a child's assent is not required where a person with parental responsibility has already provided consent for a specific purpose, but this exception is not expressly referenced in the Draft Code.
- It is also unclear from the proposals how an organisation could assure itself that parental consent has been properly given by the correct person.
A child under the age of 15 wishes to enable the collection of sensitive information on an online service. The online service must seek the child's assent to the handling and to contacting their parent for the purpose of obtaining their consent.
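The sequencing in the 'double assent' mechanism matters: the parent may only be contacted after the child has assented both to the handling and to the parent being approached. A sketch of that ordering, under our own assumptions (the function and parameter names are hypothetical, and the callable stands in for the entity's whole parental-contact and verification process):

```python
# Hypothetical 'double assent' sketch for a child under 15 enabling
# higher-risk handling (eg sensitive information collection).

def double_assent_flow(child_assents_to_handling: bool,
                       child_assents_to_contacting_parent: bool,
                       obtain_parental_consent) -> bool:
    """Return True if the handling may be enabled.

    `obtain_parental_consent` is a callable representing the entity's
    process of contacting the parent, verifying parental responsibility
    and recording consent. It is never invoked unless the child has
    assented first: without assent, the parent must not be contacted.
    """
    if not (child_assents_to_handling and
            child_assents_to_contacting_parent):
        return False
    return obtain_parental_consent()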
Additional compliance burdens: notices and compliance activities (ss 22 – 25, 37 – 40)
Notices and policies
The Draft Code contains multiple notice and transparency requirements in addition to existing APP obligations, including:
- requirements for 'age appropriate' notices (baselined at a 10–12-year-old if the age of a user is unclear);
- specific notice requirements in addition to existing APP 5 mandatory requirements;
- requirements to provide notices to both the child and parents where parental consent is obtained for a child under 15;
- making available 'child-directed' versions of an entity's privacy policy where the services are likely to be accessed by children (which would need to be a separate version given the specificity of the requirements for such policy).
Annual reviews, PIAs and training
- The Draft Code imposes a mandatory annual review of internal practices, procedures and systems in place to comply with the APPs and the Draft Code, including mandatory record-keeping obligations of such reviews and provision of records to OAIC on request.
- Entities must conduct privacy impact assessments before providing new services or activities likely to be accessed by, or primarily concerned with, children, or where they propose to adopt new or changed information-handling practices likely to have 'a significant impact on the privacy of children'. PIA requirements are prescriptive, and entities must maintain and publish online a register of PIAs.
- The Draft Code imposes additional specific training obligations.
Notices and policies
The requirement to provide additional notices and policies would apply to standard e-commerce or online services not specifically directed at children. It is questionable whether the benefits of such notices/policies outweigh the compliance burden in these contexts.
Annual reviews and PIAs
The additional compliance activities are materially onerous:
- As drafted, the annual review obligation extends to compliance with the APPs as a whole, not just the Draft Code, making it a de facto annual review of an entity's entire privacy compliance program.
- The PIA requirement is potentially very broad, requiring any new service or activity likely to be accessed by children to undertake a PIA. This does not apply any threshold of impact (in addition to requiring a PIA for any change which may have a 'significant impact on the privacy of children'). Given the scope of application of the Draft Code, businesses not directed at children are likely to be caught by these PIA obligations.
- The obligation to publish a register of PIAs is unusual and its purpose is unclear. This requirement is not explicitly limited to PIAs conducted for the purposes of the Draft Code, but may extend to all PIAs undertaken by the entity. PIAs often assess confidential matters and may not result in a project proceeding. This obligation would require the register to include PIAs for confidential projects, without any corresponding privacy benefit to individuals.
An e-commerce platform that likely has customers under 18 and collects their personal information would need to:
- draft additional bespoke privacy policies and notices directed at children;
- assess whether any change in its information-handling practice as part of a confidential internal business restructure project will 'significantly impact' the privacy of customers under 18; and
- publish any such PIA to a public register, even where the relevant proposed change to information handling does not proceed.
New information access and request rights (ss 28, 30 – 31)
The Draft Code provides additional rights relating to access to personal information, including:
- requiring entities to respond to an APP 12 access request by a child in a format that is simple, easy-to-understand, age-appropriate and enables meaningful understanding of what is held;
- providing a right for a child or parent to request information about an entity's handling of their personal information (including categories of information, purposes, recipients, retention periods and automated decision-making or profiling).
- The obligation to respond to an access request in an 'age-appropriate format' is likely to present significant operational challenges. For instance, it is unclear how a request involving a large volume of information or complex data can be presented in such a manner.
- The right to request additional information on handling appears duplicative of other transparency obligations (such as the requirement to have a privacy policy or provide a collection notice), and it is unclear whether it will provide any real benefit to the recipient, particularly given its operational burden.
A child requests information from the provider of their favourite gaming app about the information held and how it is handled. The provider already has an APP-compliant privacy policy to which it would normally direct customers, but must respond to the request in an 'age-appropriate' form.
New destruction obligations (s 32)
- The Draft Code creates a new right for a child (or parent/guardian) to request destruction of personal information held by an entity.
- The entity must destroy the information unless an exception applies (eg where destruction would be unlawful, or there is a serious threat to life, health or safety, or existing or anticipated legal proceedings).
- Unlike APP 11, this is a strict obligation to destroy (rather than a 'reasonable steps' obligation) and does not contemplate de-identification as an alternative.
- The proposed destruction right is extremely onerous.
- The stated rationale for departing from the APP 11 approach of 'reasonable steps' and permitting de-identification is that greater protection should be owed to children's information. However, there is no balancing consideration of the operational impact and practicality (including, for example, where information is held in backups and not readily accessible).
- It is also unclear how this right would be implemented in the following contexts:
- where children's information is comingled with other data;
- information held in the context of services acquired by adults or third parties rather than the child (eg a photo sharing service where the end user is a parent), and/or where third parties hold rights (eg copyright) in the personal information;
- whether the right requires compliance with a destruction request made by a child of any age (eg a 3-year-old).
- Unlike other parts of the Draft Code, there is no 'best interests' test the entity must consider prior to actioning the request. There is also no recognition that a child under a certain age may lack the capacity to make a significant decision of this nature, which sits uneasily with the Draft Code's position elsewhere that a child cannot provide consent until they are at least 15.
A child may be able to request the destruction of personal information held on a parent's family photo sharing service. It is unclear what the implications are where the photos do not only contain images of the child (eg a photo has multiple subjects).
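The strictness of the s 32 obligation is easiest to see when reduced to its decision logic: absent one of a short list of exceptions, the only compliant outcome is destruction. The sketch below is illustrative only; the exception labels are our paraphrases of the examples given in the Draft Code, and a real implementation would involve far more nuance.

```python
# Hypothetical sketch of the s 32 destruction obligation. Note what is
# absent from the logic: no 'reasonable steps' qualifier, no
# de-identification alternative, and no 'best interests' gate.

DESTRUCTION_EXCEPTIONS = {
    "destruction_would_be_unlawful",
    "serious_threat_to_life_health_safety",
    "existing_or_anticipated_legal_proceedings",
}

def handle_destruction_request(applicable_circumstances: set) -> str:
    """Return the action the entity must take on a destruction request
    from a child (or a parent/guardian on their behalf)."""
    if applicable_circumstances & DESTRUCTION_EXCEPTIONS:
        return "refuse"
    # De-identification is expressly excluded as an alternative.
    return "destroy"
```

The gaps the alert identifies (comingled data, backups, third-party rights, requests from very young children) are precisely the inputs this simple model has no branch for.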
Actions you can take now
The OAIC is seeking feedback through the consultation process. Given the significant impact on entities, as well as the uncertainty raised by several of the proposed requirements, industry engagement during the consultation period is critical. If your organisation would like to submit feedback, consider stress-testing the impact of the Exposure Draft on your business as a means to substantiate any feedback you provide, including by:
- Assessing scope exposure to determine whether any of your organisation's online services are 'likely to be accessed by children' or 'primarily concerned with the activities of children'.
- Assessing the feasibility of age-gating your services, and how this might practically be implemented.
- Considering how 'best interests' would be assessed against your current information-handling practices and how such a change would impact the use of personal information in your business.
- Reviewing the impact of the Draft Code on the frameworks through which you currently obtain consent and provide notices.
- Considering the implications of destruction requests and identifying practical concerns with how the destruction obligation would be operationalised.


