Want to improve your Facebook reach? Focus on more divisive, controversial content.
That, unfortunately, would appear to be the best advice based on a recent Vice investigation into how Fox News has been able to dominate news media rivals on the platform by taking a more opinionated, partisan approach to its coverage.
As per Vice's report:
"In the Trump era, Fox News has cemented itself as the most dominant news publisher on Facebook as measured by engagement, a crucial metric for Facebook’s ranking system and a rough gauge of attention on the platform, topping news organizations like The New York Times and The Washington Post. Over the past three years, CrowdTangle estimates that Fox News’ main Facebook page, with 17 million followers, has racked up 80 percent more reactions, comments, and shares than CNN, which has 31 million followers."
While driving comments and engagement makes sense for any publisher, Fox News has clearly learned how to maximize this better than others, tapping into the sharing dynamics of Facebook's notorious News Feed algorithm to boost the reach of its content.
Indeed, according to an unnamed former social producer from Fox, the company made argumentative, divisive content a key strategic focus.
“We would intentionally post content that would be divisive and elicit a lot of comments,” the former social producer said, adding that high-engagement posts would inform programming decisions on TV. “Fox was all about a numbers game.”
Again, that makes sense, as these are the elements that Facebook's algorithm uses as indicators of likely popularity - a post that's sparking conversation is likely to spark even more if more users see it. But it also highlights the dangers of Facebook's platform in skewing public opinion, and fueling tribalism and divides.
One of the key concerns with Facebook's algorithm is that in showing users more of what they like, and are likely to engage with, and less of what they don't, it leads to an entirely unbalanced news diet, which further separates both sides of each argument.
This was best illustrated by The Wall Street Journal's 'Blue Feed, Red Feed' experiment, in which WSJ created two separate Facebook feeds for comparison - one which followed only conservative-aligned Pages, and the other which followed more liberal-aligned providers on the platform.
The variation in what each group potentially sees is stark, and it serves as a reminder of the dangers of algorithmically sorting users into such buckets - which is even more relevant now that Facebook is up to 2.38 billion users, and some 43% of US adults get at least some of their news content on the platform.
You can see, then, how Facebook could be playing a more significant role in the modern political process than you might expect - you, personally, may not think that Facebook's influence is very big, because what you're seeing in your feed hasn't changed significantly. But for many users, it has, and what they see is entirely different from what you're experiencing.
Fox News has adapted this model better than most - but while that might be better for the company's bottom line, it likely isn't so good in terms of informing the public. Of course, the counter to this is that Fox News is popular because it's telling the truth - if Facebook, for example, were to look to reduce the reach of Fox News, because its coverage is too skewed towards provoking argument, then it would be criticized for editorializing, for taking political sides.
How true you consider that to be likely depends on your own political ideology - an area that Facebook definitely does not want to get into.
It's interesting, then, to note that Facebook has this week announced the first round of researchers and academic institutions that it will work with to examine social media’s role in elections. Facebook will provide these approved groups with "privacy-protected Facebook data", to help them in their study.
As per Facebook:
"We hope this initiative will deepen public understanding of the role social media has on elections and democracy and help Facebook and other companies improve their products and practices. Over the past two years, we have made significant improvements in how we monitor for and take action against abuse on our platform. We know we can’t do this work alone, and much of the progress we have made is due to significant support from external partners, including governments, civil society groups, NGOs, other private sector companies and academics. This initiative will deepen our work with universities around the world as we continue to improve our ability to address current threats and anticipate new ones."
For one thing, the last time Facebook opened its data up to academics, it didn't go so well. Facebook has no doubt learned lessons from this (hence the specific note of "privacy-protected Facebook data"), but it'll be interesting to see the scope of this research, and whether it extends beyond clear violations of Facebook's guidelines by politically motivated organizations to partisan news outlets which, in essence, are only operating within the environment that Facebook itself has created.
It's Facebook's own systems that have pushed outlets like Fox News towards this method of coverage, which Fox insiders essentially confirm in Vice's report.
"I was so preoccupied that our numbers were doing well that it wasn’t until the tail end of the election that the guilt was seeping in,” the former Fox News social producer said. “We all knew that Trump was going to win.”
If that is, indeed, the case, can it be fixed? Is it possible for Facebook to now roll back its processes to reduce the focus on argumentative, divisive content?