Australian regulators have released new guidance ahead of the introduction of industry codes designed to protect children from exposure to harmful online material.
The Age Restricted Material Codes will apply to a wide range of online services, including app stores, social platforms, equipment providers, pornography sites and generative AI services, with the first tranche beginning on 27 December.
The rules require search engines to blur image results involving pornography or extreme violence to reduce accidental exposure among young users.
Search services must also direct people searching for information related to suicide, self-harm or eating disorders towards professional mental health support, rather than surfacing results that could deepen a harmful spiral.
eSafety argues that many children first encounter disturbing material at very young ages, often stumbling upon it through search results rather than seeking it out deliberately.
The guidance emphasises that adults will still be able to access unblurred material by clicking through, and there is no requirement for Australians to log in or identify themselves before searching.
eSafety maintains that the priority lies in shielding children from images and videos they are not yet equipped to process and cannot forget once seen.
These codes will operate alongside existing standards that tackle unlawful content and will complement new minimum age requirements for social media, which are set to begin in mid-December.
Authorities in Australia consider the reforms essential for reducing preventable harm and guiding vulnerable users towards appropriate support services.
