New Reports Underline Facebook’s Role in Exacerbating Political Divides – But Will Facebook Take Action?

As part of the ongoing investigation into potential antitrust activity among US tech giants, the CEOs of Google, Facebook, Amazon and Apple all recently appeared before the US House Judiciary Committee, where they were asked a range of questions relating to varying concerns about how their companies operate.

And one of those queries was particularly pertinent as we head into the 2020 US Election period – in his allotted time, Rep. David Cicilline put this statement to Facebook CEO Mark Zuckerberg:

Facebook is profiting off and amplifying disinformation that harms others, because it’s profitable. This isn’t a speech issue, it’s about Facebook’s business model that prioritizes engagement in order to keep people on Facebook’s platforms to serve them more advertisements.”

Is that true? Does Facebook profit from division, and the engagement that it generates?

In his response, Zuckerberg, predictably, said that this was incorrect, and that Facebook shows people “what’s going to be the most meaningful to them”, not just what’s going to generate the most engagement. Cicilline then followed up with an example of a recent COVID-19 conspiracy video that racked up 20 million views in five hours on Facebook.

“A lot of people shared that,” Zuckerberg responded.

So does that mean that it was ‘meaningful’ to them? Is Zuckerberg being disingenuous by side-stepping the specifics of the query?

This is one of the key elements of debate around the broadening political divide – what role do social media, and Facebook specifically, play in exacerbating existing societal division? There’s always been a level of debate between the two sides of the political spectrum, but it does seem to have become more pronounced, and more influential, in recent times – and the biggest change within that period is the number of people now getting their news content from social platforms.

And the logic behind the concern makes sense – Facebook’s algorithm does prioritize engagement. If you post something that generates a lot of shares and discussion, that will ensure your subsequent posts get more reach, as Facebook’s system looks to get more people involved in such exchanges in order to keep them on platform for longer.

That’s altered the incentives for news publishers with respect to how they publish. An opinion piece titled ‘COVID-19 is a hoax, here’s the real truth’ will generate more response than one with the headline ‘Scientists have been studying COVID-19 for years’. Both articles could have the same content, but one headline is more salacious, and plays to people’s inherent desire to believe that they’re being mistreated by the government.
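To see how that plays out in the abstract, here’s a minimal sketch of an engagement-weighted feed ranking. To be clear, Facebook’s actual ranking system is proprietary and vastly more complex – every signal and weight below is a hypothetical stand-in, chosen only to illustrate the incentive:

```python
from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    shares: int      # hypothetical engagement signals
    comments: int
    reactions: int

def engagement_score(post: Post) -> float:
    # Illustrative weights only: discussion-driving actions (shares,
    # comments) are weighted above passive reactions, so posts that
    # spark arguments rank higher in the feed.
    return 3.0 * post.shares + 2.0 * post.comments + 1.0 * post.reactions

posts = [
    Post("Scientists have been studying COVID-19 for years", 40, 25, 300),
    Post("COVID-19 is a hoax, here's the real truth", 900, 1200, 500),
]

# Rank the feed by engagement: the salacious headline wins the top
# slot - and if prior engagement also feeds into a publisher's future
# reach, it keeps winning on every subsequent post.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>8.0f}  {post.headline}")
```

Under any scoring of this general shape, the publisher’s incentive is obvious: the post that provokes the most reaction gets the most distribution.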

Again, that approach has always been effective – just take a look at the celebrity gossip magazines and the way that they rope readers in with unfounded rumors and tales. But Facebook has turned many more publishers into gossip machines, while also giving this type of material infinitely more reach, and therefore influence.

Facebook does know this. As much as Zuckerberg might try to play it off and pretend that he’s unaware, or that it’s up to the people to decide what they want, and that Facebook has no real part to play other than being the platform that hosts such discussion, Facebook knows that its algorithms exacerbate division – its own executives have admitted as much.

Earlier this year, Facebook executive Andrew Bosworth, a former head of the company’s mobile ads department, shared an internal memo in which he offered his thoughts on various controversies around how Facebook operates – its role in influencing elections, data-sharing, etc.

Among his notes, Bosworth dismissed the idea of filter bubbles – the theory that Facebook’s algorithm shows users more content that they’re going to agree with, and less of what they won’t.

Bosworth said that Facebook’s system actually ensures that users are exposed to significantly more content sources than they would have seen in times before the internet.

Ask yourself how many newspapers and news programs people read/watched before the internet. If you guessed “one and one” on average you are right, and if you guessed those were ideologically aligned with them you are right again. The internet exposes them to far more content from other sources (26% more on Facebook, according to our research).”

Facebook’s COO Sheryl Sandberg cited this same research in October last year, noting more specifically that 26% of the news which Facebook users see in their feeds represents “another point of view”.

So, that’s better right? Facebook actually ensures that people see more perspectives, so it can’t be held responsible for reinforcing political division. Right?

Not exactly – again, from Bosworth:

The focus on filter bubbles causes people to miss the real disaster which is polarization. What happens when you see 26% more content from people you don’t agree with? Does it help you empathize with them as everyone has been suggesting? Nope. It makes you dislike them even more. This is also easy to prove with a thought experiment: whatever your political leaning, think of a publication from the other side that you despise. When you read an article from that outlet, perhaps shared by an uncle or nephew, does it make you rethink your values? Or does it make you retreat further into the conviction of your own correctness? If you answered the former, congratulations you are a better person than I am. Every time I read something from Breitbart I get 10% more liberal.”

So, Bosworth effectively acknowledges that, yes, Facebook’s News Feed algorithm amplifies division. Not in the way many assume – by narrowing users’ perspective, only showing them content they agree with – but the opposite: by showing users more content from more sources, Facebook pushes them further to either side of the political divide.

Again, this is a Facebook executive acknowledging the effect – Facebook is aware of it. Zuckerberg may deny it, but he knows as well as Bosworth does, because this is a discussion that’s been had among Facebook’s leaders.

And that only looks set to get worse as more people become reliant on Facebook for news content. Thousands of local publishers are being forced to shut down, with the COVID-19 pandemic proving the final nail in the coffin for their businesses. And when locals can no longer get local news from a trusted outlet, where do you think they’ll turn?

Facebook then further reinforces such division by hosting extremists within private groups, where their discussion is out of the public eye, and therefore beyond broader scrutiny. That’s been underlined again this week with the leak of internal documents showing that Facebook currently hosts thousands of groups and pages, with millions of members and followers, which support the QAnon conspiracy theory.

QAnon has gained a fervent following online for purportedly sharing secret insights into the Trump administration’s ongoing battle against the ‘deep state’ – which the theory claims is a collection of elite business people and celebrities who are secretly controlling the world.

Twitter announced a crackdown on QAnon-linked accounts last month, and Facebook is now also reportedly weighing its options. But the sheer scope of the movement’s reach on Facebook is staggering, and underlines the role that Facebook can play in amplifying extreme views and fringe theories.

It’s not hard to see how the combination of divisive content amplification and the hosting of groups sympathetic to either side, at Facebook’s scale, could be a major problem. And again, Facebook will deny this; it’ll play down its role, and claim it knew little about its groups being used for such activity. But this is not new information – various reports have highlighted the same concerns for years.

And in amongst this, you have claims of Facebook removing fact-checks in order to avoid conflicts with political leaders, allowing hate speech to remain on its platform because it’s ‘in the public interest’, and giving politicians free rein to outright lie in their ads so that the people can decide.

On balance, when assessing the various factors, it’s hard not to conclude that Facebook would prefer to leave such content alone, as it does indeed spark more engagement. Does Facebook benefit from that? Yes, it does. If Facebook were forced to take a tougher stance on controversial posts and opinions, would it cost the company money, and likely see it lose engagement? Yes, it would.

In essence, Facebook wants all the benefits of being the most used social platform in the world, but none of the responsibility. But at 2.7 billion users, its influence is simply too great for it to take a hands-off approach – that responsibility is huge, and there should be consequences for failing in this respect.

Mark Zuckerberg would prefer to focus on the positive – the idealistic view of social media as a connective tool that unites the world for good. But that’s not what’s happening, and the company needs to realign with how its platforms are actually being used, in order to mitigate the dangerous trends emanating from their darker corners.

If it doesn’t, expect the political division to worsen over time.
