Social media moderators urge German lawmakers to tackle ‘exploitative’ working conditions
Cengiz Haksoz, who has worked as a content moderator at outsourcer TELUS International, is due to appear before the Bundestag’s Digital Council on Wednesday afternoon. He is expected to tell lawmakers his work screening harmful material left him “mentally and emotionally drained”.
TELUS International is a well-known provider of content moderation services for Facebook, among others.
Social media firms like Meta’s Facebook and ByteDance’s TikTok work with thousands of content moderators around the world, who are responsible for blocking users from seeing harmful content such as child pornography and images of extreme violence.
Haksoz is expected to deliver a petition, signed by more than 300 content moderators in Germany, calling for a new set of legal protections for those in the industry, including improved access to mental health services, a ban on non-disclosure agreements (NDAs), and improved pay and benefits.
“I was led to believe the company had appropriate mental health support in place, but it doesn’t. It’s more like coaching,” said Haksoz, speaking exclusively with Reuters ahead of his Bundestag appearance.
“This is a very serious job, and it has serious consequences for workers. This job has changed me,” he said. “And these outsourcers are helping the tech giants get away from their responsibilities.”

Meta has faced mounting pressure over the working conditions of the content moderators who keep its platform safe. In 2020, the company paid a $52 million settlement to American content moderators who suffered long-term mental health problems as a result of their work.
“Without us, social media companies would collapse overnight,” reads the petition, seen by Reuters. “Social media can never be safe until our own workplaces are safe and fair.”
Meta and TELUS International declined to comment.