Spanish court acknowledges the hidden trauma of content moderators

A Spanish court recognised a Meta content moderator’s psychological distress as a work-related injury and held Meta’s subcontractor, the worker’s employer, liable for the damage. The ruling is a pivotal step towards addressing the mental and emotional health of social media content moderators and improving their well-being.


In a significant legal development, a Spanish court held CCC Barcelona Digital Services, a Meta subcontractor, responsible for the severe mental distress suffered by a content moderator it employed. In the course of his duties, the moderator was exposed to violent, hateful, and disturbing content on Facebook and Instagram, which led him to develop psychiatric illnesses.

The court ruled that work-related factors were the sole, exclusive, and unquestionable trigger of the illness, rejecting CCC’s claim that the psychological distress stemmed from a ‘common illness.’ Twenty other employees have filed criminal complaints against CCC, accusing the company of gross infractions: they say the employer knew the risks inherent in content moderation work yet failed to put emotional and psychological support structures in place for its staff. Telus, CCC’s parent company, plans to appeal the verdict.

Social media companies worldwide commonly outsource content moderation to third parties, firms that often employ low-paid workers and provide them with no safeguards or support systems. While Meta maintains that it requires all such subcontractors to make adequate provision for counseling, training, and other worker support, along with technical tools to help reduce exposure, its policy does not address the pressures created by the productivity and performance quotas these employers impose.

Why does it matter?

This decision by the Spanish legal system sets an important precedent: it recognises the immense psychological damage suffered by content moderators and the need to hold companies accountable for the well-being of employees in this line of work. The ruling could have broad implications for the industry and strengthen protection and support for these workers.