OAIC Releases Draft of Children’s Online Privacy Code Drawing from the UK Age-Appropriate Design Code


On March 31, 2026, the Office of the Australian Information Commissioner (OAIC) released its much-anticipated Exposure Draft of the Privacy (Children’s Online Privacy) Code (Draft Code). It introduces a number of novel concepts in addition to drawing from the UK Age-Appropriate Design Code (UK AADC), in an effort to “uplift privacy practices across entities more broadly” and protect children’s privacy in Australia. This post breaks down how it could impact businesses.

Background

  • The Draft Code would apply to providers of designated internet services, relevant electronic services and social media services (Online Service) but only if the Online Service is primarily concerned with the activities of children or likely to be accessed by children.
  • The Draft Code therefore intends to cover any online space that may process large amounts of children’s personal information. This will capture Online Services which are not directed to children, such as update-sharing apps used by daycares.
  • The Draft Code may apply to organisations who provide an Online Service, even where those organisations are not typically considered as technology platforms. For example, organisations who offer a subscription service to children (in addition to their broader offering) will need to ensure that that service complies with the Code.
  • The key features of the Draft Code, as are now well publicised and discussed below, are:
    • age assurance for all users;
    • assessing and building into an Online Service’s data practices the “best interests” of a child;
    • enhanced consent requirements;
    • unprecedented obligations when direct marketing to children; and
    • introducing a right to delete for children.
  • The Code is open for public consultation until 5 June 2026, with the opportunity for participants to attend virtual roundtables between 31 May and 5 June. While the Code must be registered by 10 December 2026, no commencement date has been announced. Participants are welcome to propose an appropriate commencement date (and transition period) in their submissions.
  • It is clear that the OAIC has been ambitious in both the scope and content of the Draft Code. We strongly encourage clients to consider the effect of the Draft Code – including those features which we have flagged as possible Issues for Implementation below – and make submissions on areas which may be of concern.
  • Many of the proposed inclusions in the Code would change the application of the Australian Privacy Principles (APP) and could hint at broader changes to come – which is another good reason to make your views known.

What is the Code?

  • The Code is intended to sit under the Privacy Act 1988 (Cth) (Privacy Act) as a form of delegated legislation.
  • Once registered, the Code will operate in addition to the requirements of the APPs. Organisations who are caught by the Code will demonstrate their compliance with the APPs (as they relate to children) through adherence to the Code.

Key Features

Age Assurance

Key Insights
  • Under the Draft Code, before personal information is collected from an individual, an Online Service must take reasonable steps to ascertain the end user’s age (Age Assurance).
  • Like the UK AADC, the form of Age Assurance is required to align with the “risk of harm” arising from the processing of personal information:
    • If an Online Service collects a very small amount of information from children, it may be that that Service can impose low-friction Age Assurance.
    • On the other hand, the more personal information an entity collects about children and the higher-risk its purposes (e.g. targeted advertising), the more robust the Age Assurance needs to be.
  • It is open to the Online Service not to undertake Age Assurance if they apply the protections in the Code to all end-users, but this is unlikely to be a practical option for Online Services with a mixed audience.
  • Age assurance applies differently and more extensively under the Draft Code than under the Phase 2 Online Safety Codes, the social media pause or other countries’ online safety laws. This is because:
    • It applies before any personal information is collected from an end-user – regardless of whether that end-user has an account with the Online Service or not.
    • Unlike the social media pause, Age Assurance involves ascertaining an end-user’s age. Presumably, this is because the Draft Code applies differently depending on a child’s age (e.g., some obligations only apply if a child is under 15). However, it means that, as drafted, an Online Service must ascertain an adult end-user’s age. This is disproportionate and contrary to the UK AADC, which requires instead that online services understand the “age range of children likely to access the service”.
    • Unlike Singapore’s Code of Practice for Online Safety for App Distribution Services, the OAIC has decided that it will not be the responsibility of a third party (like an app store provider) to conduct age assurance.
Issues for Implementation
  • Despite the stated effort to ensure consistency, the Draft Code is more prescriptive than the UK AADC: it requires entities to review and update, at least annually, their privacy processes (including for Age Assurance) and imposes strict limits on the retention of facial age estimation information, which is treated as a category of sensitive information. This places a higher burden on businesses and organisations captured by the Code.
  • Further, unlike other Australian laws, Age Assurance will extend to registered and unregistered users. This will be a challenge for many Online Services, who will have to find a proportionate way of collecting and retaining such information.
  • Where an Online Service caters to children of different ages and to both registered and unregistered users, there is real doubt about what “reasonable steps” will look like, and the practical outcome may be that the Service imposes the highest possible threshold, regardless of risk.

Privacy by default

Key Insights
  • Under the Draft Code, an Online Service must implement measures which, by default, ensure that the entity only collects, uses or discloses personal information as is strictly necessary to provide their service.
  • Children must be able to control any additional collection, use or disclosure of personal information that is not strictly necessary, via accessible and clear means.
Issues for Implementation
  • This proposed requirement is closely aligned with the UK AADC (which requires “high privacy by default”) but, unlike the GDPR, there is no express requirement for data minimisation in the Privacy Act. There is also no requirement for data privacy by default. This new obligation therefore sits as a wholly new requirement, and operates quite differently from APP 3, which imposes a reasonably lenient condition that personal information must be “reasonably necessary” before it is collected.
  • As drafted, the Draft Code would apply broadly, including to Online Services which are likely to be accessed by children. Many older children who access these Services may appreciate having a tailored, targeted online experience. They will no longer be able to get that – at least by default.

Best Interests of a Child

Key Insights
  • The Draft Code provides that children’s personal information must be collected, used and disclosed consistent with the ‘best interests of the child’ (Best Interests). The principle has been taken from the United Nations Convention on the Rights of the Child and the Explanatory Statement to the Draft Code suggests it will involve consideration of factors like:
    • the nature and extent of child exploitation risks;
    • the likely mental or physical impacts on the child; and
    • the extent to which children’s abilities may be affected.
  • The Best Interests test applies to almost all activities where personal information is handled about a child. The way that it has been framed in the Draft Code contrasts with the UK AADC, which requires that an online service consider instead “best interests of child users in…[the] design of an online service”.
  • In particular, the test applies in addition to existing permissions for using personal information (including where for a related secondary purpose), and to prescriptive requirements for obtaining a child’s consent under the Draft Code (including a requirement to refresh consent every 12 months).
  • In this way, it operates similarly to the “fair and reasonable” test that the OAIC has frequently espoused and which the Government accepted in principle in its Response to the Privacy Act Review (Government Review), which is equally broad and would apply on top of existing obligations under the APPs.
Issues for Implementation
  • Best Interests under the Draft Code go beyond what the Australian Government agreed to in principle in 2023, creating onerous implementation challenges for entities.
  • Instead of treating “best interests” of children as one factor that entities must have regard to when considering whether collection, use or disclosure is ‘fair and reasonable’ (as proposed in the Government Review), the Draft Code adopts an extremely strict approach by imposing it as a threshold condition for all collection, use and disclosure. In particular, it removes the flexibility of an organisation being able to rely on use/disclosure of personal information in a way which is consistent with an individual’s “reasonable expectations” and related to the primary purpose of collection.
  • Even though the Draft Code does, at times, recognise that different age ranges have different interests and needs, Best Interests is another area where this recognition is not carried through:
    • The obligation to handle personal information consistently with a child’s best interests does not make any reference to different age ranges – or, as flagged above, the different levels of interaction that a child user may have with an Online Service.
    • Acting consistently with a 17‑year‑old’s best interests is quite different from acting consistently with a 13‑year‑old’s best interests, yet the Draft Code does not provide detailed guidance on how entities should navigate these differences in practice.
Direct marketing
  • To compliantly market to a child, an Online Service must:
    • obtain personal information directly from the child;
    • obtain consent, which meets the higher thresholds discussed below (in Consent);
    • ensure that such use and disclosure of personal information is consistent with a child’s best interests; and
    • offer a simple means to opt out of direct marketing.
  • Similar to Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing, the onus now lies with the Online Service to show that the targeting of advertising is in that child’s best interests.
Issues for Implementation
  • There is real doubt around the activities to which these provisions apply. In the Government Review, the Attorney-General draws a distinction between “direct marketing” and “targeted advertising”. However, the OAIC has previously indicated that “direct marketing” includes “online advertising”, such as displaying an advertisement based on cookie-based data.
  • The Explanatory Statement refers to “direct marketing newsletters” (suggesting a more conventional approach), but we will need clarity on what is meant and how these new rules are intended to operate.
  • It is difficult to see how an Online Service could ever successfully argue that direct marketing is in a child’s best interests, as such marketing is, by its nature, intended to promote the Service’s brand above all else. While the Explanatory Statement notes that the Best Interests test does not prevent an entity from “pursuing its own commercial…interests”, the requirement that these not be incompatible with the best interests of a child makes this difficult.

Consent

Key Insights
  • The Draft Code sets the age of consent by a child to the collection, use or disclosure of their personal information at 15 years old, and requires parental consent for children under that age (with some exceptions).
  • There is acknowledgment that an Online Service must take reasonable steps to confirm that a person who gives consent holds parental responsibility, but still no clear indication of how to do so without collecting significant personal information.
  • When obtaining parental consent for children under 15, an Online Service must still issue a “consent notice” to children. In some circumstances (including if a child under 15 enables direct marketing), an Online Service must obtain both parental consent and seek a child’s “assent”. This is likely to be difficult to put in place.
  • Finally, the requirements for valid consent under the Draft Code exceed the definition of “consent” under the Privacy Act. According to the Draft Code, children’s consent must be:
    • voluntary (which means not bundled and not obtained by manipulative, deceptive or misleading practices);
    • informed, with the specific information required set out in the Draft Code;
    • current and refreshed every 12 months;
    • specific; and
    • able to be withdrawn.
Issues for Implementation
  • The requirements for 12-month consent validity, mandatory child-friendly notices and, in some cases, “double consent” (through child assent) introduce new concepts that many Online Services are not currently equipped to support. Online Services with a mixed audience will likely have to quarantine consents for children, to avoid the higher requirements for children infecting all of their consents.
  • There remains a question of whether child assent is meaningful, given children have varying levels of comprehension. There are further concerns about the effectiveness of relying on parental consent to protect children’s online data, as this assumes that parents are cognisant of, and able to conceptualise, all the varying forms of privacy dangers that children may be subject to.
  • Finally, like the social media pause, the requirements for consent under the Draft Code are higher than those under the APPs generally. Query also whether consent which is “manipulative, deceptive or misleading” has the potential to overlap with or, at worst, contradict proposed amendments to the Australian Consumer Law which seek to regulate just that.

Notice and consent fatigue

Key Insights
  • On our count, children may, in certain circumstances, have six documents available to them in relation to an Online Service’s privacy practices:
    • APP 1 website privacy policy;
    • APP 1 website privacy policy (children’s version);
    • APP 5 privacy notice;
    • age appropriate consent notice;
    • child-specific information about inquiries and complaints; and
    • in some cases, anything required to obtain a child’s assent.
Issues for Implementation
  • Especially for children, the above is a lot of information and seems to contravene the Draft Code’s focus on clarity, simplicity and ease. We query whether more information necessarily helps children understand how their personal information is handled.
  • Online Services will need to ensure that the information contained in each of these documents is consistent and does not contradict anything said elsewhere.

Right to delete

Key Insights
  • The Draft Code introduces a broad right for children to request the destruction of their personal information.
  • There are exceptions, but they are very limited (such as where the Online Service is required under Australian law to retain such personal information).
  • Even if it is only directed to children, it is likely that this will influence broader privacy practice in Australia. Currently, the obligation to destroy or de-identify personal information is limited to “reasonable steps” only under APP 11.
Issues for Implementation
  • The Productivity Commission has rejected the proposed Tranche 2 right to erasure, warning that reforms “risk entrenching existing problems” and may “exacerbate the regulatory burden.”
  • While this is a significant uplift to Australian privacy law, the Commission’s concerns highlight uncertainty about whether such rights are proportionate or scalable for most entities.
  • There are also questions as to how effective a right to delete can be in preventing the aggregation and harvesting of children’s data. For instance, a right to delete alone cannot necessarily protect a child against material that has been screenshotted and shared by third-party end-users.

All of these terms are defined under the Online Safety Act 2021 (Cth).

The author thanks and acknowledges Navanitha Gajendran for research and editorial support.


