Facebook Outlines its Evolving Efforts to Combat Misinformation Ahead of New Investigation

Ahead of this week’s House Energy and Commerce Committee investigation into how social platforms are tackling misinformation, Facebook has provided an overview of its evolving efforts to better protect users and reduce the distribution of false reports on its platforms.

Fake news became a key focal point during the Trump administration, with the former President regularly deriding most mainstream media reportage as false, which, in many ways, complicated enforcement efforts. But as social platforms have become more essential sources of information for more and more people, Facebook and others have invested more heavily in addressing the problem, which Facebook says has led to a significant reduction in false reports gaining traction on its network.

But that work is always ongoing, as Facebook explains:

“It is tempting to think about misinformation as a single challenge that can be solved with a single solution. But unfortunately, that’s not the case. Thinking of it that way also misses the opportunity to address it comprehensively. Tackling misinformation actually requires addressing several challenges including fake accounts, deceptive behavior, and misleading and harmful content.”

On this front, Facebook says that:

  • It disabled more than 1.3 billion fake accounts between October and December of 2020
  • It has removed over 100 networks of coordinated inauthentic behavior (CIB) – organizations seeking to use Facebook to manipulate and mislead in order to drive users towards a politically motivated outcome
  • It’s built teams to address the financial incentives behind misinformation, addressing a key motivation for such efforts
  • It now has over 35,000 people working to address misinformation, and it has also made significant advances in AI detection

Facebook has also provided a full overview of its efforts, detailing each of its advances to better protect users.

Given the ongoing reports about the flow of misinformation on Facebook, it may seem like the problem is worsening over time, but Facebook is putting significant effort into combating various types of manipulation.

A key example is the COVID-19 vaccination push:

“Since the pandemic began, we’ve used our AI systems to take down COVID-19-related material that global health experts have flagged as misinformation and then detect copies when someone tries to share them. As a result, we’ve removed more than 12 million pieces of content about COVID-19 and vaccines.”
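
Facebook doesn’t detail how that copy detection actually works, but the general idea behind this kind of near-duplicate matching is to index content that has already been flagged and compare new posts against it. The sketch below is a minimal, purely illustrative example using word shingles and Jaccard similarity – an assumed, simplified stand-in for the technique, not Facebook’s actual system, and the flagged_index entries and threshold value are hypothetical.

```python
# Hypothetical sketch: flagging near-copies of content already marked as
# misinformation, using word shingles and Jaccard similarity.
# This is NOT Facebook's system - just an illustration of the general idea.

def shingles(text: str, n: int = 3) -> set:
    """Break text into overlapping n-word shingles for fuzzy matching."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a: set, b: set) -> float:
    """Similarity between two shingle sets (0 = no overlap, 1 = identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Content that (hypothetically) health experts have flagged as misinformation.
flagged_index = [
    ("claim-001", shingles("drinking bleach cures the virus in 24 hours")),
]

def is_probable_copy(post: str, threshold: float = 0.6) -> bool:
    """Return True if the post closely matches any flagged item."""
    post_shingles = shingles(post)
    return any(jaccard(post_shingles, s) >= threshold for _, s in flagged_index)

if __name__ == "__main__":
    print(is_probable_copy("Drinking bleach cures the virus in 24 hours!!"))  # True
    print(is_probable_copy("Vaccines underwent standard clinical trials."))   # False
```

At Facebook’s scale, a production system would presumably rely on more robust signals, such as perceptual hashes for images and learned embeddings for text, but the matching principle is the same: once a piece of content is flagged, close variants can be caught automatically.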

Even so, a lot of COVID-19 misinformation remains on Facebook. Last year, research suggested that the platform had facilitated the spread of a wide range of health misinformation – and likely contributed to the COVID-19 death toll as a result. Facebook has upped its enforcement since then, but its capacity to reach billions of people, and to amplify such messaging, makes the platform a particularly dangerous vector for these types of campaigns.

That scale poses a range of challenges in this respect – Facebook now operates a platform of 2.8 billion interconnected people, a larger user base than any other company has had to manage, which creates unique difficulties in tracking information, monitoring trends, and facilitating, or restricting, information flow.

That’s not to let Facebook off the hook – it has the resources to improve its efforts as a result of that usage. But still, it is worth remembering that no one has ever dealt with these issues on such a scale before, and Facebook is still developing its systems in line with this. 

But Facebook does seek to clarify one key point on this front:

“Despite all of these efforts, there are some who believe that we have a financial interest in turning a blind eye to misinformation. The opposite is true. We have every motivation to keep misinformation off of our apps and we’ve taken many steps to do so at the expense of user growth and engagement.”

Facebook says that it will act accordingly, potentially reducing engagement for the sake of the greater good in this respect. How true that is, is hard to say – but clearly, Facebook is evolving its efforts on this front, and it is working to combat misinformation campaigns, in various ways, across its platforms.

Facebook remains a key distribution channel for various concerning movements, not just misinformation, but it is working to address them, and after the Capitol riot earlier this year, it also looks set to become more proactive in tackling such trends before they take root, which could lead to improved outcomes moving forward.

The House Energy and Commerce Committee investigation could end up mandating more effort on this front either way, but it’s interesting to note what Facebook is already doing, and the challenges it faces, as media distribution evolves, and social media becomes more influential over time. 

You can read Facebook’s full listing of its efforts to combat misinformation, polarization and dangerous organizations here.
