This is big. Today, Facebook announced a new test that will reduce the amount of political content in people’s News Feeds, in response to concerns about the impacts of divisive political debate on the platform.
As explained by Facebook:
“As Mark Zuckerberg mentioned on our recent earnings call, one common piece of feedback we hear is that people don’t want political content to take over their News Feed. Over the next few months, we’ll work to better understand people’s varied preferences for political content and test a number of approaches based on those insights. As a first step, we’ll temporarily reduce the distribution of political content in News Feed for a small percentage of people in Canada, Brazil and Indonesia this week, and the US in the coming weeks.”
Indeed, in Facebook’s recent earnings call, Zuckerberg noted, specifically, that:
“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services. So one theme for this year is that we’re going to continue to focus on helping millions more people participate in healthy communities and we’re going to focus even more on being a force for bringing people closer together.”
Aside from simply reducing the amount of political content in user feeds, Facebook says that it will also test a range of different ways to rank political content, with a view to improving user engagement and experience.
“To determine how effective these new approaches are, we’ll survey people about their experience during these tests. It’s important to note that we’re not removing political content from Facebook altogether. Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person’s appetite for it at the top of their News Feed.”
The change in approach is a significant step for Facebook, which has seemingly allowed, and even incentivized, divisive political content thus far because of the amount of engagement it generates.
Facebook’s infamous News Feed algorithm rewards interaction and response, which means that more divisive, argumentative content can effectively gain more reach, because it sparks emotional reactions and prompts users to post more replies, submit Reactions, share more content, etc.
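To illustrate that dynamic, here’s a minimal sketch of an engagement-weighted ranking. To be clear, this is not Facebook’s actual News Feed code – the signal names and weights are invented for illustration – but it shows how any system that scores posts by the interactions they generate will naturally push argumentative content up the feed:

```python
# Illustrative sketch only - invented signal names and weights,
# not Facebook's actual News Feed ranking logic.

def engagement_score(post):
    """Score a post by the interactions it generates.

    Comments and shares are weighted more heavily than passive
    Reactions, so posts that spark long reply threads tend to
    outrank quieter ones.
    """
    return (
        3.0 * post["comments"]
        + 2.0 * post["shares"]
        + 1.0 * post["reactions"]
    )

posts = [
    {"id": "calm_update", "comments": 5, "shares": 2, "reactions": 40},
    {"id": "divisive_take", "comments": 120, "shares": 60, "reactions": 300},
]

# Rank the feed: the divisive post wins simply because it
# provokes more interaction, regardless of its quality.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])
```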
That’s fairly clearly reflected in the most-shared link posts on the platform each day, a list of which is posted on this Twitter account.
The top-performing link posts by U.S. Facebook pages in the last 24 hours are from:
1. Donald Trump For President
2. Ben Shapiro
3. Fox News
4. Hillary Clinton
5. Ben Shapiro
6. ForAmerica
7. NPR
8. Bernie Sanders
9. Newsmax
10. USA Patriots for Donald Trump
— Facebook’s Top 10 (@FacebooksTop10) February 10, 2021
Partisan political outlets always dominate these lists, which underlines the engagement value of confrontational, argumentative content on the platform – but if Facebook sees such a usage benefit from this content, why would it consider reducing it? Is it purely a public good measure?
Well, probably not.
As per Facebook’s latest results, the platform’s daily active user count is actually flat-lining in the US.
That could be an indicator of what Zuckerberg’s now saying – Facebook users have had enough of the political debates on the platform, which could have reached the point where they’re now turning people away from the app.
Facebook wants active engagement, which political content can bring, but not at the expense of overall usage. If that balance is shifting, and more people are now logging onto Facebook less often because of the incessant political debates, maybe that’s why Facebook has decided that the time has come to de-emphasize those posts.
And Facebook has actually already tried this approach.
In the days following the 2020 US election, and amid rising political tensions, Facebook deliberately reduced the reach of more partisan, divisive news outlets on the platform, in favor of more reputable providers, in order to ensure improved balance in political news coverage, with a view to quelling community angst. This led to what Facebook staffers internally referred to as the ‘nicer’ News Feed, reducing the intensity of debate and division across the board, while also keeping people who rely on the platform for news adequately informed.
According to reports, several staffers actually asked if they could keep the nicer feed beyond the post-election period.
Now, that appears to be exactly what Facebook’s doing.
“During these initial tests we’ll explore a variety of ways to rank political content in people’s feeds using different signals, and then decide on the approaches we’ll use going forward. COVID-19 information from authoritative health organizations like the CDC and WHO, as well as national and regional health agencies and services from affected countries, will be exempt from these tests.”
So, the ‘nicer’ News Feed now looks to be on its way, which, as noted, could be a major step for the platform, and for society more broadly. Again, I would be hesitant to give Facebook too much credit here, as it does seem more inspired by engagement stats than motivated by a change of heart at the company. But regardless of the ‘why’, a less divisive, less argumentative Facebook, a platform used by almost 3 billion people worldwide, can only be a good thing.
If Facebook’s tests show that more people engage more often, and in more positive ways, as a result, it’ll look to implement the change more broadly – which, given the impact of political partisanship over the last four years, is the likely outcome.
And it might even reinvigorate Facebook, making it a more critical connective element.