Outsourcing firm Sama regrets moderating disturbing Facebook content in East Africa

CEO Wendy Gonzalez acknowledged the company’s misjudgment and announced that Sama will no longer take on harmful content moderation work.

Meta's logo

Sama, an outsourcing company contracted to moderate Facebook posts in East Africa, has expressed regret for its involvement in content moderation after former employees reported being traumatized by exposure to graphic and distressing material.

The former Kenya-based workers have taken legal action against the firm, asserting they suffered emotional distress from videos depicting beheadings, suicide, and other disturbing content.

Sama’s CEO, Wendy Gonzalez, acknowledged the mistake and said the company would no longer take on harmful content moderation, work that made up only a small portion (4%) of its business yet had an outsized negative impact on its core operations. Citing “lessons learned,” she said Sama now has a policy against accepting such tasks and will also refrain from AI-related projects that support weapons of mass destruction or police surveillance.

Why does it matter?

The case spurs a closer look at industry practices and support for content moderators, particularly the outsourcing of moderation to regions with lower labor costs. Earlier this year, for instance, it was revealed that OpenAI had used outsourced Kenyan workers, contracted through Sama, who earned less than $2 per hour. That revelation has prompted discussion of the negative impacts of content moderation work and the need for better safeguards in such roles.