Australia has begun enforcing new Age-Restricted Material Codes, which require online platforms to introduce stronger protections to prevent children from accessing harmful digital content.
The rules apply across a wide range of services, including social media, app stores, gaming platforms, search engines, pornography websites, and AI chatbots.
Under the framework, companies must implement age-assurance systems before allowing access to content involving pornography, high-impact violence, self-harm material, or other age-restricted topics.
These measures also extend to AI companions and chatbots, which must prevent sexually explicit or self-harm-related conversations with minors.
The rules form part of Australia’s broader online safety framework overseen by the eSafety Commissioner, which will monitor compliance and enforce the codes.
Companies that fail to comply may face penalties of up to $49.5 million per breach.
The policy aims to shift responsibility toward technology companies by requiring them to build protections directly into their platforms.
Officials in Australia argue the measures mirror long-standing offline safeguards designed to prevent children from accessing adult environments or harmful material.
