Australia has entered a new phase of digital regulation. Through the Online Safety Act 2021 and the associated Online Safety Codes and Standards, the government has introduced a framework designed to reduce exposure to harmful online content, particularly for children and young people. These measures are administered by the eSafety Commissioner, Australia's independent online safety regulator.
While the codes are primarily targeted at large online services such as social media platforms, hosting providers and search engines, the broader direction is clear. Organisations operating digital platforms will increasingly be expected to demonstrate how safety, moderation and age-appropriate access are managed.
What is surprising, however, is how little attention these changes have received outside regulatory circles.
The regulatory framework in simple terms
The Online Safety Codes and Standards sit under the Online Safety Act and establish obligations for online service providers to address illegal and harmful content.
The framework distinguishes between two broad categories:
- Unlawful Material Codes and Standards, which deal with seriously harmful content such as child exploitation or terrorist material
- Age-Restricted Material Codes, which are designed to prevent children from accessing harmful but legal material such as pornography, extreme violence or self-harm content
Compliance obligations are already taking effect:
- The Unlawful Material Codes became mandatory in December 2024.
- The Age-Restricted Material Codes begin applying progressively through 2025 and 2026, depending on the type of online service.
The regulator has also issued detailed guidance to help platforms interpret and comply with the requirements.
Reference: Online Safety Codes and Standards Regulatory Guidance (PDF)
Age assurance and the under-16 social media rule
One of the most significant developments is the Social Media Minimum Age requirement, which obliges age-restricted platforms to take reasonable steps to prevent Australians under 16 from creating or maintaining accounts.
The obligation formally takes effect in December 2025.
Under the guidance, platforms must deploy systems that can:
- detect under-age users
- prevent account creation
- deactivate existing accounts if necessary
- mitigate attempts to bypass controls
These "reasonable steps" can include a mix of technology, processes and policies.
The section that raises the most practical questions
One part of the regulatory guidance stands out.
The document states that service providers should take "reasonable steps to ensure age assurance measures cannot be circumvented."
Examples suggested in the guidance include:
- liveness checks in identity verification
- device telemetry and behavioural signals
- VPN detection and IP intelligence tools
- geolocation consistency checks
This raises an important practical question for organisations and platform providers: what exactly counts as "reasonable"?
For global platforms with significant resources, deploying layered detection systems may be achievable. For smaller organisations or niche digital platforms, the interpretation of reasonable could be very different.
The legislation intentionally leaves room for judgement, but that ambiguity may create uncertainty for organisations trying to comply.
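As a rough illustration of the kind of layered signal the guidance points to, the sketch below combines a VPN flag from an IP intelligence lookup, a mismatch between the IP country and the account's declared country, and recent location hopping into a single circumvention score. The signal structure and weights are assumptions made for the example, not values taken from the guidance.

```typescript
// Illustrative only: the signal structure and scoring weights are assumptions
// for this sketch, not categories or thresholds defined in the guidance.

interface SessionSignals {
  declaredCountry: string;   // country the account claims to operate from
  ipCountry: string;         // country resolved from the client IP
  ipFlaggedAsVpn: boolean;   // result of an IP intelligence lookup
  recentCountries: string[]; // countries seen for this account recently
}

// Returns a rough 0..1 score of how likely the session is attempting to
// circumvent location-based or age-based controls.
function circumventionScore(s: SessionSignals): number {
  let score = 0;

  if (s.ipFlaggedAsVpn) score += 0.4;
  if (s.ipCountry !== s.declaredCountry) score += 0.3;

  // Rapid hopping between many countries is another consistency signal.
  const distinctCountries = new Set(s.recentCountries).size;
  if (distinctCountries > 3) score += 0.3;

  return Math.min(score, 1);
}

const score = circumventionScore({
  declaredCountry: "AU",
  ipCountry: "NL",
  ipFlaggedAsVpn: true,
  recentCountries: ["AU", "NL", "US", "SG", "DE"],
});
// A platform might log the score, require re-verification above a threshold,
// or feed it into a broader risk model.
console.log(score); // -> 1
```

A small platform might stop at something this simple; a large one might run far richer models. Both could plausibly claim their approach is "reasonable", which is exactly where the uncertainty lies.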
What organisations actually need to do
Even if your organisation is not a large social platform, the direction of regulation is clear. Digital services are expected to take responsibility for the safety implications of how their systems operate.
At a minimum, organisations operating digital platforms should consider the following:
- Conduct a platform risk assessment. Understand how users interact with your service and where exposure to harmful content or behaviour could occur.
- Implement reporting and moderation processes. Platforms should provide clear pathways for users to report harmful content and have internal processes to review and respond (a minimal sketch follows this list).
- Introduce appropriate age safeguards. Where services are likely to attract younger audiences, organisations may need to consider age-appropriate access controls.
- Document policies and decision processes. Platforms should be able to demonstrate how safety risks are identified and addressed.
- Monitor attempts to bypass safeguards. If controls are introduced, organisations should consider how they might be circumvented and what mitigation steps are reasonable.
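As a minimal sketch of the reporting and moderation point above, the example below shows a simple report intake and triage queue in which higher-severity categories are reviewed first. The categories, severity mapping and ContentReport shape are illustrative assumptions, not definitions taken from the codes.

```typescript
// A minimal sketch of user report intake and triage. The categories,
// severity mapping and ContentReport shape are assumptions for illustration.

type ReportCategory = "child_safety" | "self_harm" | "violence" | "spam" | "other";

interface ContentReport {
  id: string;
  contentId: string;
  category: ReportCategory;
  submittedAt: Date;
  notes?: string;
}

// Higher number = reviewed sooner. Keeping the mapping explicit makes the
// triage policy easy to document and audit.
const SEVERITY: Record<ReportCategory, number> = {
  child_safety: 3,
  self_harm: 3,
  violence: 2,
  spam: 1,
  other: 1,
};

class ModerationQueue {
  private reports: ContentReport[] = [];

  submit(report: ContentReport): void {
    this.reports.push(report);
  }

  // Return the highest-severity, oldest report for a moderator to review.
  next(): ContentReport | undefined {
    this.reports.sort(
      (a, b) =>
        SEVERITY[b.category] - SEVERITY[a.category] ||
        a.submittedAt.getTime() - b.submittedAt.getTime()
    );
    return this.reports.shift();
  }
}

const queue = new ModerationQueue();
queue.submit({ id: "r1", contentId: "c9", category: "spam", submittedAt: new Date() });
queue.submit({ id: "r2", contentId: "c4", category: "self_harm", submittedAt: new Date() });
console.log(queue.next()?.id); // -> "r2" (higher severity is reviewed first)
```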
A practical checklist for digital platforms
If you operate a digital platform in Australia, it may be worth reviewing the following questions.
- Does your platform allow user generated content or communication?
- Are children or young people likely to use the service?
- Are harmful content risks documented and monitored?
- Are reporting pathways clear and easy to use?
- Are moderation responsibilities clearly defined?
- Are identity or age checks required in some circumstances?
Even if the current codes do not directly apply to your service, these areas are increasingly becoming standard expectations.
The bigger issue: safety, privacy and practicality
The goals of the Online Safety Act are broadly supported. Few people would argue against protecting children from genuinely harmful content.
However, the implementation raises legitimate questions.
If platforms increasingly rely on identity verification, behavioural analysis and device tracking to determine age or legitimacy, organisations must also think carefully about privacy, security and proportionality.
For example:
- Where is identity data stored?
- How long is it retained?
- Who has access to it?
- What happens if that data is compromised?
These questions deserve as much attention as the safety objectives themselves.
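One way to keep them in view is to record data handling decisions as explicit, reviewable configuration rather than leaving them to informal practice. The sketch below is illustrative only; the data types, storage locations, retention periods and roles are assumptions, not recommendations from the regulator.

```typescript
// Hedged sketch: documenting how age assurance data is handled as data,
// so it can be versioned, reviewed and enforced by scheduled deletion jobs.
// All field names, durations and roles here are illustrative assumptions.

interface RetentionPolicy {
  dataType: string;
  storedIn: string;        // where the data lives
  retainDays: number;      // how long before automatic deletion
  accessRoles: string[];   // who can read it
  breachPlaybook: string;  // what happens if it is compromised
}

const ageAssurancePolicies: RetentionPolicy[] = [
  {
    dataType: "id_document_image",
    storedIn: "encrypted object storage (AU region)",
    retainDays: 0, // verify, then discard immediately
    accessRoles: ["verification-service"],
    breachPlaybook: "notify privacy officer; assess breach notification obligations",
  },
  {
    dataType: "age_check_result",
    storedIn: "primary database",
    retainDays: 365, // keep the outcome, not the document
    accessRoles: ["trust-and-safety", "support-leads"],
    breachPlaybook: "rotate tokens; review access logs",
  },
];

console.log(ageAssurancePolicies.map((p) => `${p.dataType}: ${p.retainDays} days`));
```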
What organisations should do next
For most organisations, the immediate task is not deploying complex identity systems.
Instead, it is understanding the regulatory direction and ensuring that digital platforms are designed with safety, transparency and accountability in mind.
That means:
- reviewing moderation processes
- clarifying reporting pathways
- assessing age risk where relevant
- documenting policies and responsibilities
The Online Safety Codes and Standards represent an important shift in how digital platforms are expected to operate in Australia.
The next challenge will be how consistently the concept of reasonable steps is interpreted in practice.