Facebook Adds New Warning Prompts to Stop Unintended Sharing of Older News Articles

With more people relying on Facebook for news content, The Social Network has also become a key venue for debate and discussion around that content. Unfortunately, that also includes the spread of misinformation and the skewed framing of events to further political agendas.

Part of this stems from blatant misinformation being circulated, both on Facebook and online more broadly. But another element that Facebook has found problematic is that some users re-share older news articles or posts and frame them as new information.

That can confuse debate over current issues, and Facebook is now looking to address it with a new warning screen: when people go to share a link, the prompt will alert them if the content is more than 90 days old.

As shown in Facebook's example sequence, the warning screen alerts users to the age of the article they're looking to share, but it won't stop them from sharing it if they choose.
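Facebook hasn't shared implementation details, but the underlying check is conceptually simple: compare a link's publication date against the 90-day threshold before the share completes. Here's a minimal sketch in Python; the `needs_age_warning` function, the `publish_date` input, and the example date are assumptions for illustration, not Facebook's actual code:

```python
from datetime import datetime, timezone

# Threshold from Facebook's announcement: articles older than
# 90 days trigger the warning screen.
ARTICLE_AGE_LIMIT_DAYS = 90

def needs_age_warning(publish_date: datetime) -> bool:
    """Return True if a shared article is older than the 90-day threshold.

    `publish_date` is a hypothetical input; Facebook hasn't documented
    how it determines an article's publication date.
    """
    age = datetime.now(timezone.utc) - publish_date
    return age.days > ARTICLE_AGE_LIMIT_DAYS

# Illustrative date: an article published back in 2013 would trigger
# the prompt, but the user can still choose to share it anyway.
if needs_age_warning(datetime(2013, 1, 8, tzinfo=timezone.utc)):
    print("This article is over 90 days old. Share anyway?")
```

The key design point, per Facebook, is friction rather than blocking: the prompt adds a moment of reconsideration, but the final decision to share stays with the user.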

As per Facebook:

“Over the past several months, our internal research found that the timeliness of an article is an important piece of context that helps people decide what to read, trust and share. News publishers in particular have expressed concerns about older stories being shared on social media as current news, which can misconstrue the state of current events.”

Several publications have sought to address this issue on their own. Last April, The Guardian began adding a clear date marker to older articles when they're shared on social media.

[Image: The Guardian's date warning on an older article]

As an example, The Guardian noted that it regularly saw spikes in re-sharing of one of its older articles on horse meat regulations in the UK, originally published in 2013. The old article had been repeatedly used to re-ignite debate around the issue, with sharers outraged that such a proposal was being considered, again.

This is not the first time that Facebook has added warning prompts to get people to re-think their sharing behavior on the platform. 

Back in 2016, Facebook added similar pop-ups on posts that had been disputed by third-party fact-checkers, prompting users to reconsider before they hit 'Share'.

[Image: Facebook's disputed content warning prompt]

Research has found those warnings to be effective in slowing the spread of misinformation. Earlier this year, Facebook also began showing links to official information in the News Feeds of users who had shared COVID-19 reports that were later found to be untrue.

And Facebook may be looking to go a step further on that particular front:

“Over the next few months, we will also test other uses of notification screens. For posts with links mentioning COVID-19, we are exploring using a similar notification screen that provides information about the source of the link and directs people to the COVID-19 Information Center for authoritative health information.”

I mean, theoretically, Facebook could show the same for all types of misinformation. But then again, part of that comes down to will, and to what Facebook classifies as crossing the line.

For example, reports this week have suggested that Facebook has exempted climate change denial content from fact-checking by classifying it as 'opinion', as opposed to regular news content. Facebook could seemingly do more to limit the spread of misinformation here, as it's doing with COVID-19, but it needs to decide what it considers 'harmful'.

So while these new warning prompts are a positive step, it still comes down to Facebook's judgment as to which types of information it will take action on.

Whether such content is harmful is, for now, a matter of personal perspective. But the question then is: should it be?
