Facebook is continuing to expand its efforts to restrict the reach of spam and misleading content, announcing yet another News Feed algorithm change - this time focused solely on a practice called 'cloaking'.
Cloaking is a more advanced spammer tactic - the spammer uses redirects, keyed to the viewer's IP address or another identifier, in order to avoid detection.
As explained by Facebook:
"For example, they'll set up web pages so that when a Facebook reviewer clicks a link to check whether it's consistent with our policies, they're taken to a different web page than when someone using the Facebook app clicks that same link."
As noted, the process is more advanced than the usual 'bait-and-switch' type of News Feed deception - which Facebook also moved to stamp out recently by taking away the ability to edit link previews on posts.
Original link post on the left, edited link preview on the right
By taking away the capacity to change the link preview, Facebook reduces spammers' ability to easily redirect users to their scam sites. Facebook hasn't said how common this type of link baiting is (or was), but it was prominent enough to warrant a new rule to eliminate it.
Cloaking is another example of the ever-evolving cat-and-mouse game between spammers and web platforms. For every rule and update the platforms introduce, spammers work to find loopholes to exploit. That makes sense, of course - they make money by doing so - but it also degrades the user experience, which is bad news for digital marketers overall, as it makes users more suspicious of all web links.
It's the latest in Facebook's wider efforts to eliminate spam and misinformation, which have been ramped up significantly since the 2016 US Presidential Election - in which Facebook was believed to have played a significant role in shaping public opinion by exacerbating division among voters. The News Feed system is built on showing people more of what they want to see, which also means it can be used to reinforce opinion. This was best demonstrated by The Wall Street Journal's 'Red Feed, Blue Feed' experiment, which showed how people leaning to either side of the political spectrum were shown very different stories in their Facebook feeds.
Cloaking is not, in itself, part of this process, but with Facebook looking to eliminate the financial motivation for those misusing its feed, all of these types of practices are coming under scrutiny.
And financial motivations were definitely behind at least some of the political propaganda - groups of teens in Macedonia, in particular, have admitted to creating pro-Trump websites to spread fake news. As such, Facebook is broadening its net to weed out these bad actors.
In order to crack down on cloaking, Facebook says it will be using artificial intelligence and expanded human review processes to 'better observe differences in the type of content served to people using our apps compared to our own internal systems'.
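Facebook hasn't published the details of that detection pipeline, but the core comparison it describes can be sketched simply: fetch the same link as a regular app user and as an internal reviewer, then flag the link if the two responses differ materially. The function names and hashing approach below are assumptions for illustration, and a real system would need to tolerate benign differences like rotating ads or A/B tests:

```python
import hashlib

def content_fingerprint(html: str) -> str:
    """Hash a served page so two fetches can be compared cheaply."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def looks_cloaked(page_for_reviewer: str, page_for_user: str) -> bool:
    """Flag a link when a reviewer and an ordinary user receive
    different content for the same URL - the basic signal behind
    the comparison Facebook describes (toy illustration only)."""
    return content_fingerprint(page_for_reviewer) != content_fingerprint(page_for_user)
```

In practice this kind of exact-match check would only be one signal among many - which is presumably where the AI and expanded human review come in.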
Facebook says that Pages found to be using cloaking will be removed entirely, as the process is 'deliberate and deceptive'.
As explained by Facebook ads product director Rob Leathern (to TechCrunch):
"There's no legitimate use case for cloaking. If we find it, it doesn't really matter who that actor is. They're usually bad actors and spammers by definition. So the line is if anyone does this in any way, shape, or form, we want them off the platform."
Thus far, Facebook says it has already taken down 'thousands' of offenders through its new detection methods.
So what's the potential impact for you and your Page? If you don't use cloaking, nothing. Depending on how Facebook detects such processes, it's theoretically possible that similar redirect methods could be misattributed as cloaking, though Facebook's systems are fairly advanced - once Facebook knows what it's looking for, it's pretty good at zeroing in on exactly that. And the chances of you using similar redirects for any other purpose are fairly slim.
Basically, if you don't use cloaking, the change is good, as it can only increase consumer confidence in clicking on Facebook links - links to legitimate Pages and content like that which you and your company produce. Really, that's the key benefit of all of Facebook's wider moves on this front: for every spammer removed, and every spam trick detected, Facebook reinforces confidence in its system, which helps users feel more secure when clicking through to other sites.