Former Facebook moderators sound alarm over treatment of workers ahead of US election


Alison Trebacz, a former Facebook content moderator based in Arizona, remembers the day of the 2017 Las Vegas mass shooting, which killed 58 people and injured more than 800 others, almost as if she were there.

She came into work that morning expecting to see graphic content, but nothing could have prepared her for the queues full of videos of dead and dying victims waiting for her when she arrived.


Trebacz, who worked at Facebook from 2017 to 2018, was paid just $15 an hour, or around $30,000 a year before taxes, to watch footage such as this – thousands of hours of graphic content, including suicides and injuries.

After six months on the job, family and friends noticed she had started to become intensely cynical, a stark shift from the positive and upbeat person she once was. At bars with friends she would frequently burst out crying, seemingly without warning. Most days she would go to bed at 6pm after getting home from work.

“I was really starting to lose myself, just seeing the worst of humanity every day,” she said.

Trebacz is one of two former moderators who came forward on Monday with claims that Facebook underpays and mistreats its contract workers.

Such allegations are not new: for years moderators, whom Facebook contracts through third parties and who are not considered staff, have called on the company to change the way it treats them. In May 2020, Facebook agreed to pay $52m in a settlement with moderators who claimed the company did not do enough to protect them from the mental health impacts of the job.


But the calls are being renewed ahead of the 2020 elections, which will place moderators on the frontlines of an integral moment in American democracy. Moderators are responsible for policing hate speech and calls to commit violence – such as a Wisconsin militia group’s “call to arms” that led to two deaths in Kenosha this summer – which may be particularly consequential around the elections.

“We’re just over a week out from the most important US election in generations, and Facebook still won’t get to grips with lies and racism run amok on its platform,” said Cori Crider, co-founder of UK-based tech justice group Foxglove, who organized the event with moderators and the Real Facebook Oversight Board.

The former workers and civil rights groups backing them, including Color of Change, are calling on the company to convert moderators into employees, give them the same rights as other Facebook staff, and train and support them in their crucial role protecting democracy. Facebook is increasingly relying on artificial intelligence to detect objectionable content, but it is not advanced enough yet to crack down on more nuanced issues like racism or offensive humor.

“Human moderators are not going to go away – they are only becoming increasingly important,” Trebacz said. “If Facebook wants valuable feedback from the people doing the bulk of the work, they would benefit by bringing them in house.”

A spokesperson for Facebook told the Guardian all content reviewers at Facebook go through “an in-depth training program” on its content policies and are provided access to psychological support, including a 24-hour on-call service. The company added it relies on “technical solutions” such as artificial intelligence to limit exposure to graphic material as much as possible.


“We’re incredibly grateful to our reviewers for the work they do keeping our platform safe,” the spokesperson said.

Ahead of the elections, Facebook has adopted measures to crack down on violence and misinformation, including banning the conspiracy theory movement QAnon and "militarized social movements", including pages and content that encourage violence.

Trebacz said mental health care in the office was severely lacking during her time at Facebook, which ended before these measures were introduced. There was just one counselor in an office of more than 80 employees, and strict NDAs prevented workers from discussing the nature of the job with even their partners or family. Facebook said the NDAs are meant to protect workers from potential retaliation for content moderation decisions and to protect user information, and that workers can discuss some parts of their jobs with family members as long as they don't go into specifics.

But Trebacz said many moderators “end up bottling it all up inside and being trauma bonded to all of your co-workers”.

She avoided seeing a psychiatrist because she was afraid she couldn’t afford it. Converting the roles to full-time jobs with benefits packages would make such care accessible, she said.


Facebook has agreed to pay $52m in a settlement with moderators who claimed it didn’t do enough to protect them from the mental health impacts of the job. Photograph: Ilana Panich-Linsman/Getty Images

Trebacz said that, in her experience, the average person worked as a moderator for just six months before leaving. More than 11,000 people have worked as moderators from 2015 to 2020, according to a lawsuit settled in May.

“People just can’t keep doing that for such a low wage,” she said.

In addition to low pay, these workers don’t receive the same benefits as company employees and lack unemployment insurance, sick leave and collective bargaining rights, said Jade Ogunnaike, the senior campaign director of Color of Change.

“When companies like Facebook make these grand statements about Black Lives Matter, and that they care about equity and justice, it is in direct contrast to the way that these content moderators and contractors are treated,” she said. “If Facebook wants to be viewed favorably, it’s going to have to start at home and treat its employees well.”

This was the case for Viana Ferguson, who worked at Facebook from August 2016 to February 2019 as an assistant manager on a content moderation team. She quit the job after taking a six-month leave of absence to address depression and anxiety that had spiraled out of control during her time there.

“Facebook’s comments around racial justice definitely seem like lip service,” she said.

For her, the mental toll of the job was compounded by management’s refusal to take her experiences as a Black woman seriously. She said on multiple occasions she was forced to explain to white managers above her why a certain meme or image was racist, only to have her input ignored.

“I felt it wasn’t encouraged for me to express my perspective,” she said. “It seemed like it was my job to align and assimilate. It was like, do you want me to be doing a job or do you want me to be a robot?”

Ferguson was hired early in Facebook’s process of expanding its content moderation team, which has grown substantially in subsequent years. Like Trebacz, she reviewed hundreds of videos an hour and decided whether to place a warning on the content, remove the content, or keep it online and do nothing. She was there for such major news events as the Las Vegas shooting, the Parkland shooting in Florida and the 2016 elections.

She said she saw the amount of hate speech increase exponentially after Donald Trump was elected. She believes users were emboldened by the president, who has posted content she believes violates the platform’s rules. Ferguson said while the content was difficult to experience, what was more frustrating for her was the way the job was managed. Moderators were often given little direction in how to deal with substantial news events, and risked punishment or even firing if they removed a post that was later deemed not to have violated Facebook rules.

By the end of her time as a contractor, Ferguson said she was crying daily in front of her colleagues, unable to do her job. She said if Facebook allowed moderators a larger hand in how content is managed, and paid them fairly, the job would be more sustainable.

“Facebook needs to take more time to listen to content moderators on the frontline because they are the ones who are in it,” she said. “It can’t be a top-down approach when it’s a community effort. This is about community – about people, not a product.”


