German social media moderators rally for improved working conditions
Moderators argue that social media companies rely on their work for platform safety while neglecting their well-being.
A large group of social media moderators is urging lawmakers in Germany to improve their working conditions. These moderators, responsible for removing harmful content from platforms such as Facebook and TikTok, have raised concerns about demanding performance targets and mental health risks. Cengiz Haksöz, a content moderator employed at TELUS International, is scheduled to appear before the Bundestag’s Digital Council to describe how screening harmful material has left him ‘mentally and emotionally exhausted’.
During his appearance, Haksöz will submit a petition signed by more than 300 content moderators calling for stronger legal safeguards. The petition demands better access to mental health services, a ban on non-disclosure agreements (NDAs), and improved pay and benefits. The moderators argue that social media companies depend heavily on their work to keep platforms safe yet neglect their well-being.
Meta (formerly Facebook) has faced criticism over the working conditions of its content moderators, having previously settled a $52 million lawsuit with moderators in the United States who suffered long-term mental health issues. More recently, a Kenyan court ruled that Meta is responsible for the mental health care of the content moderators it employs.