At a Facebook deletion center in Berlin, the agents, who work for a third-party firm, remove illegal hate speech from the social network.
“We, the undersigned Facebook content moderators and Facebook employees, write to express our dismay at your decision to risk our lives — and the lives of our colleagues and loved ones — to maintain Facebook’s profits during the pandemic,” reads the letter, which was published Wednesday.
“After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office.”
The moderators go on to demand that Facebook maximize at-home work, offer hazard pay, end outsourcing and provide “real” health care and psychiatric care.
A Facebook spokesperson told CNBC that the company appreciates the work its content reviewers do and that it prioritizes their health and safety.
The social media giant, which is constantly battling to keep its platform free of questionable posts, photos and videos, outsources much of its content moderation to companies like Accenture and CPL.
“Before the pandemic, content moderation was easily Facebook’s most brutal job,” reads the letter, which is also addressed to Facebook Chief Operating Officer Sheryl Sandberg, Accenture CEO Julie Sweet and CPL CEO Anne Heraty. “We waded through violence and child abuse for hours on end. Moderators working on child abuse content had targets increased during the pandemic, with no additional support.”
“Now, on top of work that is psychologically toxic, holding onto the job means walking into a hot zone. In several offices, multiple COVID cases have occurred on the floor. Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice.”
The Guardian reported last month that Facebook moderators at CPL were being forced to work in a Dublin office despite a high-tier lockdown, while Facebook’s own employees worked from home.
The moderators, who are paid significantly less than the average Facebook employee, claim in the letter that Facebook’s AI software can’t detect all the content that breaches the company’s policies.
“Without our work, Facebook is unusable,” the letter continues. “Its empire collapses. Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can.”
“Facebook needs us. It is time that you acknowledged this and valued our work. To sacrifice our health and safety for profit is immoral.”
A Facebook spokesperson said: “While we believe in having an open internal dialogue, these discussions need to be honest. The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic.”
The spokesperson added: “All of them have access to health care and confidential wellbeing resources from their first day of employment, and Facebook has exceeded health guidance on keeping facilities safe for any in-office work.”
A CPL spokesperson said: “Our employees carry out extremely important work, keeping the Facebook platform safe. They are positively contributing to society in the work that they do in ensuring the safety of our online communities, and their roles are deemed essential. Due to the nature of the work, it cannot be carried out from home.”
An Accenture spokesperson said: “An important part of our culture is to encourage all our people to have a dialogue about issues that arise in the workplace and beyond. We welcome feedback from our people, and strongly support their right to express their perspectives.”
More than 25 Facebook content moderators in Dublin recently quit to take jobs with TikTok’s new trust and safety hubs.