Apple will not scan iCloud photos for CSAM

Apple has announced that it has abandoned its plans to scan photos stored in users’ iCloud accounts for child sexual abuse material (CSAM). Following criticism from civil society groups and security experts, Apple paused the rollout of the feature in September 2021. The company will instead focus on its Communication Safety feature, announced in August 2021, which lets parents and caregivers opt child accounts into protections through their family iCloud accounts. Apple is also developing a new feature to detect nudity in videos sent through Messages and plans to expand this protection to its other communication applications.