Facebook has announced new measures aimed at identifying and removing non-consensual intimate images (also known as revenge porn) shared on the social media platform, as well as supporting victims of such abuse. The company will use new detection technology, powered by machine learning and artificial intelligence (AI), to 'proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram'. Once flagged by the AI tool, the content is reviewed by a member of Facebook's Community Operations team, who decides whether to remove the image or video. In most cases, removal will also be accompanied by the disabling of the account that shared the content without permission. Facebook has also launched Not Without My Consent, a victim-support hub where victims of revenge porn can find organisations and resources to support them.