Taylor Swift deepfakes spark calls for criminalisation of deepfake pornography

The rapid online spread of deepfake pornographic images of Taylor Swift has generated renewed calls to criminalise the practice.


The rapid spread of explicit deepfake images of Taylor Swift online has prompted calls for the criminalisation of this practice in the US. Both Democratic and Republican politicians are advocating for legislation to combat deepfake pornography, and several US states have already enacted laws of their own.

In May 2023, Democratic congressman Joseph Morelle introduced the Preventing Deepfakes of Intimate Images Act, which aims to make it illegal to share non-consensual deepfake pornography. Morelle highlighted the emotional, financial, and reputational harm caused by such images, particularly impacting women. Republican congressman Tom Kean Jr has supported Morelle’s bill and introduced the AI Labeling Act, which calls for clearly labeling all AI-generated content.

Deepfake technology primarily targets women in a sexually exploitative manner: a 2019 study found that 96% of deepfake video content was non-consensual pornographic material. The issue has worsened as AI technology has advanced, enabling the creation of increasingly convincing images. Taylor Swift’s case has brought renewed attention to the problem, as explicit deepfake images of her have spread widely on social media platforms such as Telegram and X.

Why does it matter?

Platforms like X actively remove identified deepfake images and take action against the accounts responsible. Deepfake technology is not limited to targeting women: it has also been used to imitate high-profile men, including politicians and artists, and is expected to cause problems in elections around the world in 2024.

However, the majority of deepfake content is still directed at women in a sexually exploitative manner, and countries are taking steps to address it. The UK, for example, has already criminalised the sharing of non-consensual deepfake pornography under the Online Safety Act 2023, providing a potential model for other countries.