Australia’s eSafety commissioner unswayed by Apple’s explanation for abandoning child abuse scanning technology

The move was criticised by child safety groups and regulators, including Australia’s eSafety commissioner.


Australia’s eSafety commissioner is unswayed by Apple’s explanation for abandoning the development of technology to scan its iCloud storage services for images of child abuse.

In response to calls from the child safety group Heat Initiative, Apple outlined its concerns about the potential misuse of the technology. The technology would have allowed Apple to scan images before they were uploaded to iCloud, checking them against a database of known child abuse imagery. If a match was found, the photos would have been reviewed by Apple staff and reported to authorities.
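To make the matching step concrete, the sketch below illustrates the general idea of checking files against a database of known hashes before upload. It is a minimal, hypothetical example: the names (KNOWN_HASHES, should_flag_for_review, upload_queue) are invented for illustration, and Apple’s actual proposal reportedly relied on perceptual hashing and on-device matching rather than plain cryptographic hashes, so this is a simplification of the concept, not Apple’s design.

```python
import hashlib
from pathlib import Path

# Illustrative placeholder set of hashes of known prohibited images.
# In a real system this database would come from child safety organisations.
KNOWN_HASHES: set[str] = {
    "3f786850e387550fdab836ed7e6dc881de23001b",
}

def image_fingerprint(path: Path) -> str:
    """Return a fingerprint for an image file (here a plain SHA-1 of the bytes)."""
    return hashlib.sha1(path.read_bytes()).hexdigest()

def should_flag_for_review(path: Path) -> bool:
    """Check an image against the known-hash database before upload."""
    return image_fingerprint(path) in KNOWN_HASHES

# Example usage: flag matching images for human review instead of uploading them.
for photo in Path("upload_queue").glob("*.jpg"):
    if should_flag_for_review(photo):
        print(f"{photo} matches a known hash; route to human review")
    else:
        print(f"{photo} clear for upload")
```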

However, Apple decided against the technology, citing risks to user security and privacy, as well as concerns about mass surveillance and unintended consequences. The decision was heavily criticised by child safety groups and regulators, including Australia’s eSafety commissioner, who called it a ‘major setback.’ In May, the eSafety commissioner declined to register industry codes that did not include measures to detect and prevent the distribution of child abuse material.

Why does it matter?

As a result, the eSafety commissioner is now developing a mandatory standard that will likely require technology similar to what Apple had been developing. The standard is still in development, and once implemented, Apple will be required to take appropriate measures to address the risk of child sexual abuse material in its services in Australia.