Facebook’s content moderators are fighting back

When people are signed off sick due to mental health issues, other team members feel like they have to pick up the slack, she explained. Content moderators have an hour and a half allocated each week to talk to these wellness coaches. But this only provides them with soft counselling, not medical care or psychiatric support, MPs were told.
“My mother was quite sick, I was uncomfortable being in the office, so this created great anxiety for me,” Plunkett tells me. “I then obviously had issues with the content. I said, ‘I don’t think it is really fair, I’m the only person [in my household] leaving my home to go out during the peak of Covid, I don’t want to be bringing this back to my mother’. I was carrying that stress and burden every day.”
While at work, Plunkett is given a queue of content to categorise: sometimes it’s violent or explicit videos or images, sometimes it’s simply a sentence of text. “If I do have bad content and I need to step away from the screen to go for a walk or go for a smoke or whatever, I can’t,” she says. Unless it falls within her allocated hour and a half of “wellness” time each week, team leaders decide whether moderators are allowed to walk away from content.
She deals with an average of 150 pieces of content a day. “I like to take my time, and get it right,” Plunkett says. But if a Facebook employee disagrees with her decision, or if she takes even a second longer than her allocated break in any given month, she says she will slide down the moderator tier scale from Tier A to B or C and be paid less.
“We have 30 minutes for lunch. I’d take 28 minutes, and I’d set a timer for my breaks on my phone so I wasn’t going over,” she explains. “It’s that fear of not wanting to be unavailable, not wanting to be potentially penalised on my bonus for going a second over.”
If moderators use up their allocated wellness time early in a given week, Plunkett claims, they cannot access any support for the rest of that week, no matter what they see. They are also told they need a “quality score” of over 85 per cent on the content they review. To achieve this they have to view content several times, and often share it between the team – even if it is upsetting – to determine how to classify it correctly and hit their targets.

Facebook claims there are no time limits on breaks. The company also says it is employing “technical solutions” to limit moderators’ exposure to potentially graphic material “as much as possible”, and that it provides psychological training and support to those who watch graphic content. Covalen did not respond to a request for comment.
“Facebook is very proud of the fact that Facebook employees get proper psychological support,” Dark says. “But outsourced moderators who do exactly the same job, and look at exactly the same content, have wellness once a week, which is not with a trained psychologist. It is not proper, meaningful, clinical long term mental health support in the way that they get at Facebook.”

Plunkett’s testimony follows that of two other Facebook moderators in Dublin, who met Leo Varadkar, Ireland’s minister for enterprise, trade and employment, earlier this year to raise their concerns. Their accounts echo the working conditions that other moderators around the world have spoken out about before.
Moderators and the union that represents them hope to use this momentum to push through changes in employment law that would improve the rights of people viewing sensitive content online and give them the same rights as Facebook employees. If that happens, it would set an important precedent.
