Zuckerberg Says Recent Policy Changes Do Not Reflect a Change in Approach at Facebook

Despite recent decisions potentially indicating otherwise, it seems that Facebook has not had a change of heart with respect to hosting divisive, dangerous content on its platform.

Over the past few weeks, Facebook has announced some significant policy changes which seemed to move towards addressing long-held concerns around the platform’s role in facilitating the spread of harmful misinformation and hate speech.

  • On October 7th, Facebook announced a full ban on QAnon-related content, including Pages, groups, and accounts representing the controversial conspiracy theory
  • On October 12th, the platform announced a ban on Holocaust denial content, overturning one of the more controversial stances the company had held 
  • On October 13th, Facebook implemented a new ban on anti-vax ads, expanding on previous measures to suppress anti-vax content

In combination, and coming in such quick succession, these moves suggested that maybe Zuck and Co. had turned over a new leaf, and that Facebook was finally starting to acknowledge what many analysts had long warned – that its platform can cause significant, real-world harm by allowing these movements to spread and reach wider audiences.

Apparently, that optimism was unfounded – as reported by BuzzFeed News, Zuckerberg recently informed Facebook employees at a company-wide meeting that the logic behind its latest policy shifts relates solely to concern about the potential for violent responses in the wake of the US Presidential Election. It is not, Zuckerberg says, reflective of a broader change in approach.

As per Zuckerberg:

“Once we’re past these events, and we’ve resolved them peacefully, I wouldn’t expect that we continue to adopt a lot more policies that are restricting of a lot more content.”

According to the report, Zuckerberg sought to reiterate that the changes do not alter Facebook’s broader approach, and relate only to the election period.

Some have also suggested that the changes are in line with projections that Biden could win the Presidency, with Facebook looking to get ahead of that shift by aligning its policies with a more progressive approach.

Not so, says Zuck:

“This does not reflect a shift in our underlying philosophy or strong support of free expression. What it reflects is, in our view, an increased risk of violence and unrest, especially around the elections, and an increased risk of physical harm, especially around the time when we expect COVID vaccines to be approved over the coming months.”

So no change from Facebook – it’s still generally open to facilitating the spread of conspiracies and misinformation, so long as people do so within the overarching platform rules.

This despite reports suggesting that the platform is a key facilitator of health misinformation, a central hub enabling QAnon’s growth, and a core recruitment and organizational tool for white supremacists. Facebook has heard these concerns for years, and it did seem like, maybe, finally, it was looking to take action.

According to Zuckerberg, that’s not the case.

How you view that stance will come down to your personal perspective – on one hand, Facebook doesn’t want to be the ‘arbiter of truth’, and would prefer to let people discuss whatever they like, with the crowd essentially dictating for themselves what’s acceptable and what’s not.

Facebook has long held that it should not be the referee in such debates – and that, really, it can’t be, which is why it’s in the process of implementing a new Content Oversight Board to rule on specific violations and decisions, taking those considerations out of the hands of internal staff.

Zuckerberg has maintained that he would prefer to allow free speech as much as possible, and that Facebook will only intervene when there’s a risk of imminent harm. Which, from a business perspective, makes sense – the more speech Facebook allows, the more discussion it can host, and the more engagement and usage it sees. Facebook would obviously prefer to have more people using its platforms more often, and setting boundaries around what’s acceptable will limit that.

From a business perspective, Facebook would prefer to let everybody have their say, regardless of what that say is. But there is a question around the qualifier of ‘imminent’ harm, and what that actually means in the context of allowing certain conversations on the platform.

QAnon is a good example – Facebook long allowed QAnon-related discussion, despite being warned about the dangers, because it saw no immediate harm connected to the movement. That changed only recently, even though many had alerted Facebook to the potential danger years ago. So ‘imminent’, in this context, is relative. Another example could be climate change – Facebook still allows climate change denial content, despite clear scientific evidence showing that inaction will lead to significant harm in the long term, a risk Facebook itself has at last somewhat acknowledged through its own climate initiatives.

Is that ‘imminent’ harm? By most definitions, probably not, but in retrospect, when things go bad, maybe we’ll see it differently.

The line, then, comes down to how you view each element. QAnon didn’t pose an imminent risk in Facebook’s view, until it did – but has Facebook waited too long to act, with the damage caused by the group, aided by the platform, already done? Anti-vaxxers have been allowed to spread their messages on The Social Network for years, to the point that health authorities are now concerned that the effectiveness of a COVID-19 vaccine will be impeded by rising anti-vaccination sentiment.

Now, that risk is ‘imminent’, but only because Facebook has allowed that discussion to flourish. 

So how do you decide what’s ‘imminent’, or what’s going to be an ‘imminent’ risk in future, based on current trends? Will Facebook’s Content Oversight Board better equip the platform to confront such elements?

What seems clear now is that Facebook has no plans to change its broader approach, which means these risks will remain. What’s far less clear is how the platform decides what is, and is not, an imminent danger.
