Amid the COVID-19 lockdowns, Facebook has been forced to shut down its moderation centers and send some 35,000 content reviewers home, significantly reducing its capacity to review posts, ads and more.
Now, The Social Network is looking to get some moderators back in operation. According to BBC News, Facebook is now re-opening some of its review offices, though staff are only being asked to return on a voluntary basis at this stage.
As per BBC:
“[Facebook] told a committee of MPs that it was now reopening some offices, with plans for social distancing and protective equipment. […] Employees will have their temperatures checked at the beginning of their shift and buildings will be deep-cleaned at the end of shifts.”
How many staff will actually return to work remains unclear – and given some of the horror stories around the experiences of Facebook moderators, it’s hard to imagine many of them will be quick to head back in, particularly when they’re on full pay while staying at home.
But Facebook is bringing moderators back, in some capacity – which is important, because there’s also been an influx of concerning content on the platform of late, including COVID-19 misinformation and, arguably worse, child pornography.
Again, from BBC:
“Earlier this month, Europol said it had seen an increase in use of the internet by those seeking child abuse material.”
Facebook has been working to get more of its moderation team working remotely, but at such scale, it’s simply not possible to get back to full capacity without re-opening some centers.
Adding capacity will not only help Facebook keep up with ongoing demand for content review, but may also free up resources to enforce its rules around coronavirus misinformation, which is evolving every day.
Case in point – this week, Facebook has seen a significant increase in posts suggesting that disinfectant and UV light can be used to treat COVID-19, after US President Donald Trump made the claims during one of his daily briefings. According to The New York Times, Facebook – along with YouTube and Twitter – has left many of these posts up despite the false information they share. With increased capacity, Facebook may be better equipped to adapt its rules and respond to trends like this before they gain traction.
The current limitations also extend to Facebook ad approvals, and even approvals for AR filters, both of which have been delayed due to reduced capacity.
Facebook CEO Mark Zuckerberg recently outlined the company’s rough plans to get back to full capacity, noting that:
“We will require the vast majority of our employees to work from home through at least the end of May in order to create a safer environment both for our employees doing critical jobs who must be in the office and for everyone else in our local communities. A small percent of our critical employees who can’t work remotely – like content reviewers working on counter-terrorism or suicide and self-harm prevention, and engineers working on complex hardware – may be able to return sooner, but overall, we don’t expect to have everyone back in our offices for some time.”
The return of moderators will improve operational capacity in these key areas, but the threat of COVID-19 remains, and Facebook will need to take a cautious approach while also seeking to motivate more staff to return to the workplace.
From an external user perspective, that’s helpful. For the moderators themselves, maybe less so.