Facebook’s Looking to Reduce Political Content – What Will That Mean for Facebook Marketing?

Last week, Facebook announced that it’s launching a new test that will see a reduction of political content in people’s news feeds.

As explained by Facebook:

“As Mark Zuckerberg mentioned on our recent earnings call, one common piece of feedback we hear is that people don’t want political content to take over their News Feed. Over the next few months, we’ll work to better understand peoples’ varied preferences for political content and test a number of approaches based on those insights. As a first step, we’ll temporarily reduce the distribution of political content in News Feed for a small percentage of people in Canada, Brazil and Indonesia this week, and the US in the coming weeks.”

Which seems like a positive step – Facebook has long been criticized for allowing potentially dangerous, politically motivated movements to thrive, which has led to various incidents of real-world harm, the most high-profile being the recent Capitol siege. That, seemingly, was the last straw for The Social Network, which is now moving to update its approach in line with evolving community expectations.

All good, right? A move in the right direction for the post-Trump era. Right?

Well, maybe – there’s a lot to consider within this shift, and there may well be impacts for your digital marketing approach.

Political Overload

First off, it’s worth considering what the motivations are for this change.

As noted, many have pointed to Facebook as a key facilitator of politically divisive content, mostly because this type of discussion drives engagement on the platform. That’s what’s repeatedly reflected in Facebook’s own stats – every day, for example, this Twitter handle, put together by The New York Times’ Kevin Roose, shares the top-performing link posts from US Facebook Pages, ranked by total interactions.

As those daily lists show, posts from far-right Pages, like Ben Shapiro, Newsmax, and Fox News, rank high, and regularly dominate the top ten.

The stats don’t lie – Pages that take more partisan, divisive, argumentative viewpoints tend to see more engagement on Facebook. That’s because this approach prompts an emotional response, and emotional response is key to provoking a reaction. Reactions lead to comments, Likes and shares, and that engagement then tells Facebook’s algorithm that this is something people are actively interested in, which sees those posts get more reach and more distribution. Facebook benefits from the activity, while also, as a consequence, amplifying these perspectives.
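To illustrate that feedback loop in the simplest possible terms, here’s a purely hypothetical sketch of engagement-weighted ranking – the weights, field names and scoring formula are invented for demonstration, and this is not Facebook’s actual ranking system:

```python
# Purely illustrative sketch of an engagement-weighted feed ranking loop.
# The weights, field names and scoring formula are invented for
# demonstration - this is not Facebook's actual algorithm.

from dataclasses import dataclass


@dataclass
class Post:
    title: str
    reactions: int   # Likes, angry faces, etc.
    comments: int    # a heated argument counts the same as a friendly chat
    shares: int


def engagement_score(post: Post) -> float:
    # Comments and shares are weighted more heavily than passive reactions,
    # so content that sparks debate or strong emotion tends to rank higher.
    return post.reactions * 1.0 + post.comments * 2.0 + post.shares * 3.0


def rank_feed(posts: list[Post]) -> list[Post]:
    # Higher engagement -> more reach -> more engagement: the loop described above.
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("Cute dog photo", reactions=500, comments=40, shares=20),
        Post("Divisive political take", reactions=300, comments=400, shares=250),
    ]
    for post in rank_feed(feed):
        print(f"{post.title}: {engagement_score(post):.0f}")
```

Even in this toy example, the divisive post comes out on top despite attracting fewer passive Likes – and that’s the dynamic Facebook’s new test is, in theory, trying to dial back.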

The equation is fairly basic, yet Facebook has repeatedly argued that political content is not as significant an element as people might think.

Back in November, in response to Roose’s top ten lists, Facebook published an official response, in which it explained that:

“Most of the content people see [on Facebook], even in an election season, is not about politics. In fact, based on our analysis, political content makes up about 6% of what you see on Facebook. This includes posts from friends or from Pages (which are public profiles created by businesses, brands, celebrities, media outlets, causes and the like).”   

So political content isn’t even a big deal, according to Facebook, which plays down the platform’s role in fueling societal division.

In contrast to this, Facebook shared its own listing of the Pages that see the most reach in News Feeds in any given week.

So according to Facebook, it’s not politics, but more light-hearted, entertaining content that generates reach. Yet, even so, the company has increasingly acknowledged the potential impact of politically-motivated content, if not through its statements, then through its actions.

Facebook banned QAnon groups back in August, then stepped up its enforcement against the conspiracy movement again in October. It also implemented new rules around election misinformation, and ultimately ended up banning US President Donald Trump from its platform for his role in inciting the Capitol riots.

Clearly, whether due to public pressure or internal realization, Facebook has been taking action against politically-motivated content. And now, it’s looking to extend that further – but is that because of a change of heart, or a change in user behavior?

The latest figures from Facebook suggest that there are some problems on the latter front, with the platform’s daily active usage declining in the US.

Chart: Facebook Q4 2020 – daily active users (DAU)

So people are coming to Facebook less often than they were before, at a time when, more broadly, people are increasingly relying on digital platforms to stay connected and up to date with the latest happenings. That, in itself, would be an internal concern for Zuck and Co., and as Zuckerberg himself has acknowledged:

“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services.”

That could suggest that Facebook is seeing a user downturn as a result of the rising amount of politically-motivated discourse, which may be why it’s now decided that the additional engagement benefits are no longer worth the blowback.

It’s a good PR move, and if it helps Facebook retain users, then it makes sense for the platform to reduce political content.

But will it actually have any impact?

Everything’s Political

An interesting consideration here is that it’s not so easy to define what ‘political content’ actually is.

OneZero’s Will Oremus took a more in-depth look at this recently, interviewing Facebook representatives who admitted that there are significant complexities at play.

As per Oremus:

“What is political content, exactly? How does Facebook define “political?” A group called “Biden for President” pretty clearly qualifies. But what about a Black Lives Matter group? Or a post about the MeToo movement? If I write a post criticizing mask mandates as an infringement on my liberty, is that political? What about if I write a post urging others to abide by mask mandates? Will that be shown to fewer people now? Perhaps more to the point, how will Facebook’s algorithm know whether my post is political or not?”

Facebook, of course, doesn’t have all the answers to these questions as yet, but the new process will seemingly look to reduce ‘hyperpartisan outlets’ in favor of more reputable, authoritative news providers.

That’s what Facebook did in the wake of the US election – in an effort to temper rising community angst, which, at that time, seemed like it could indeed lead to civil unrest, Facebook deliberately reduced the reach of more divisive news outlets on the platform.

As per The New York Times:

“The change resulted in an increase in Facebook traffic for mainstream news publishers including CNN, NPR and The New York Times, while partisan sites like Breitbart and Occupy Democrats saw their numbers fall.”

This led to what Facebook staffers internally referred to as the ‘nicer’ News Feed, reducing the intensity of debate and division across the board, while also keeping people who rely on the platform for news adequately informed.

According to reports, several staffers actually asked if they could keep the nicer feed beyond the post-election period. Now, that appears to be exactly what Facebook’s doing – but as noted by Oremus, the actual definitions here will matter, and Facebook’s processes for detecting what should see reduced reach as a result will likely take some time to evolve and shake out, which will have varying impacts.

A New News Feed

As this happens, it will also, obviously, have an impact on marketers. With less political content in feeds, there’ll be more room for other content – and going by Facebook’s lists of the content that generates the most engagement outside of politics, that will see a lot more entertainment-focused, light-hearted posts getting more reach.

As noted by Conviva’s Nick Cicero in a recent interview with Digiday:

“Brands want to be next to feel-good content for a change, and with Facebook changing its algorithm to not promote political content, community-driven, lifestyle content is going to have a moment of growth.”

Indeed, the change could see Facebook re-shifting focus onto several key elements, including lifestyle/entertainment posts, but also non-political groups (Facebook stopped promoting political groups back in November), and eCommerce.

On-platform shopping has become a bigger focus for The Social Network, especially given the rise of eCommerce amid the pandemic. The gradual expansion of Facebook Shops will make more and more posts on the platform ‘shoppable’, and the reduction of political content could pave the way for a new push towards increased product discovery and purchase activity, as part of a broader change in user behavior.

But it’s lifestyle and entertainment content that’s likely to be the biggest beneficiary. If Facebook does indeed make a significant push to squeeze out political updates, it will be looking to replace that engagement with more lightweight posts, and previous research by Buffer shows that inspirational, funny, and/or practical posts see the most engagement on the platform outside of politics.

That could pave the way for new opportunities for brand promotion and engagement. The key will still lie in emotional response, and sparking an intense reaction in the viewer, which prompts them to Like and share. But it may well be that you’ll have new, expanded opportunities to generate more Facebook reach by focusing on these types of joyful, connective updates, as the platform looks to shift away from division.

There’s a lot to come yet, and a lot of testing and experimentation to be conducted before we establish what this change actually means for Facebook, and for digital marketing.

And that also applies in a more general sense – will this change actually be good for society, and Facebook’s impact more broadly?

As Oremus notes:

“What most of us really want from Facebook and other platforms, I suspect, is not “less politics” but less hate speech, less misinformation, less algorithmic bias toward shock and outrage and tribalism — in short, less of a distortionary effect on politics.”

Indeed, reducing the algorithmic incentives behind divisive, partisan posting is key – but when a platform’s business model is built on maximizing user engagement, that same divisive content has also proven to be one of the most reliable drivers of discussion.

Where Facebook draws the line on this, and how it does so, will be key, and could help to pave a better way forward for all platforms, if it can get it right.
