Facebook has shared a new report on its efforts to stamp out inauthentic behavior (IB) that doesn’t quite meet the threshold of ‘coordinated inauthentic behavior’ (CIB) – the generally large-scale political manipulation efforts, often run by government-aligned organizations.
As explained by Facebook:
“Unlike CIB, which is typically designed to mislead people about who is behind an operation in order to manipulate public debate for a strategic goal, IB is primarily centered around amplifying and increasing the distribution of content and is often financially motivated.”
This is the lower-tier stuff – basically, the dodgy operators who try to trick people into clicking through to ad-filled junk sites by posting topical updates and trending posts.
Stuff like this – why anyone’s getting news updates from a Facebook Page named ‘We need 1Million Trumpers to Make America Great Again’ is a mystery in and of itself, but evidently, cognitive bias will lead people to share anything that fits their worldview, regardless of how nonsensical the actual headline, or source, might be.
The idea of these links is to get people to click through on the strength of the headline, when really they’re just being sent to a site filled with ads, products, whatever else.
Facebook says there are three common uses of this form of deception:
- Inauthentic distribution – This is the most common form of this lower tier of inauthentic content, and basically involves people or businesses creating fake Facebook accounts, Groups and Pages in order to distribute links back to certain Pages.
- Abusive audience building – Some use this tactic to build audiences based on trending topics – the process generally involves switching a Page or Group’s name to align with whatever’s trending in order to gather more followers/members. Once the audience is large enough, the operator can then spam them with links, etc. This is why Facebook added the Page History detail in the information tab back in 2018.
- False pretenses – Scammers will also post Facebook updates suggesting that the Page is supporting a cause, or is part of the same community as their target audience, in order to lure clicks.
So these are all the common forms you’ve likely come across in your own Facebook usage, and have avoided because you have some idea of what to look for. But not everybody is as digitally literate, and sometimes these scams work, and end up ruining people’s Facebook experience.
Facebook provides some key examples of these scams in action, the most high-profile of which is Natural News, which at one stage had more than three million followers on the platform.
Natural News was a notorious proliferator of conspiracy-related content, which Facebook eventually shut down in June 2019.
Facebook says that it removed 15 Pages, and blocked links to at least 850 domains, associated with Natural News.
“The people behind this activity engaged in repeated and egregious violations of our inauthentic behavior policy. The US business behind these Pages relied on content farms in Macedonia and the Philippines, misled people about the origin and popularity of its content, inauthentically amplified its posts with fake accounts and engaged in deceptive tactics to evade our IB enforcement.”
Facebook says that Natural News and its CEO are banned from the platform, and it continues to monitor attempts from this network to come back.
Facebook provides additional examples of similar schemes, but the moral of the story here is that Facebook is onto these scams – it knows what these operators are doing, and it’s getting better at detecting various forms of audience manipulation for clicks.
Just like with fake engagement sellers, Facebook’s cracking down, and while it might seem like a reasonable idea to pay some chumps to boost your Page engagement, especially when you’re starting out, just don’t.
I assume the types of scammers who use these services don’t read SMT, but if it’s ever crossed your mind, it’s not the way to go – and the consequences, given Facebook’s improved detection, could be significant.
You can read Facebook’s full Inauthentic Behavior update here.