EU parliament agrees on draft law to combat online child sexual abuse material

The EU parliament agreed on draft legislation mandating internet platforms to detect and report the dissemination of child sexual abuse material.


The main political groups in the European Parliament reached an agreement on draft legislation to prevent the dissemination of child sexual abuse material (CSAM) on the internet. The proposal was initially contentious: an early draft included measures allowing judicial authorities to request searches of private conversations on platforms such as WhatsApp or Gmail for questionable material. The proposed legislation mandates internet platforms to detect and report such material, and the file is anticipated to be adopted by the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs on 13 November.

The current draft focuses on establishing an EU Centre, which will serve as a central hub of expertise in fighting child sexual abuse within the EU. It will be granted significant legal capacity under the laws of each member state.

Additionally, the EU Centre will have the authority to search for CSAM in publicly accessible content, resembling the technology used by web crawlers for search engines. It will operate independently and be overseen by a Fundamental Rights Officer. Europol will be able to request information from the Centre through secure communication channels, subject to case-by-case approval based on explicit and justified requests.

Digital rights experts and providers of apps such as WhatsApp and Signal have raised concerns that detection orders would compromise end-to-end encryption, thus weakening user privacy and data security. Technologies that detect CSAM must undergo independent audits to assess their performance. This applies both to technologies provided by the EU Centre and to those developed by the service providers using them.

The draft law also addresses the role of app stores in ensuring age verification for apps and platforms. App store providers designated under the Digital Markets Act must make reasonable efforts to prevent children from accessing apps intended for adults. MEPs also outlined age verification systems, which would be mandatory only for pornography platforms.

Why does it matter?

The draft regulation addresses a pressing issue by mandating internet platforms to actively detect and report CSAM, aiming to create a safer online environment for children. By setting clear guidelines and establishing protocols for handling such sensitive material, the draft legislation shows a strong commitment to safeguarding the well-being of young internet users and marks a pivotal step towards a more secure digital space for children across the EU.