UK plans ban on deepfake AI nudification apps
Proposed UK legislation would criminalise AI nudification apps, strengthening protections against deepfake abuse and increasing pressure on developers and platforms to prevent image-based exploitation.
Britain plans to ban AI-nudification apps that digitally remove clothing from images. Creating or supplying these tools would become illegal under new proposals.
The offence would build on existing UK laws covering non-consensual sexual deepfakes and intimate image abuse. Technology Secretary Liz Kendall said developers and distributors would face harsh penalties.
Experts warn that nudification apps cause serious harm, particularly when used to create child sexual abuse material. Children’s Commissioner Dame Rachel de Souza has called for a total ban on the technology.
Child protection charities welcomed the move but want more decisive action from tech firms. The government said it would work with companies to prevent nude images of children from being created or shared.
