Get ready for the next wave of changes to Facebook's News Feed algorithm.
Today, Facebook has outlined how it's looking to improve its feed ranking processes by undertaking a range of new assessment approaches, including updated user surveys, soliciting different types of feedback, and weighting posts based on 'angry' reactions.
These new elements could end up having a big effect on what billions of people see in their Facebook feeds, and subsequently, how Pages optimize their approaches to maximize reach.
Here's a look at the various ways in which Facebook is looking to change its approach.
First off, Facebook's launching a new set of user surveys and response queries to get more insight into what people actually want to see in their feeds.
User surveys have long been a part of Facebook's feed ranking process - you've likely noticed prompts like this in your own Facebook Feed, seeking feedback on your experience.
Facebook says that 'tens of thousands' of these News Feed feedback surveys get filled out every day, giving the company a range of expanded insights to work with.
As explained by Facebook:
"While a post’s engagement - or how often people like it, comment on it, or share it - can be a helpful indicator that it’s interesting to people, this survey-driven approach, which largely occurs outside the immediate reaction to a post, gives a more complete picture of the types of posts people find most valuable and what kind of content detracts from their News Feed experience."
Part of its more recent approach in this respect has been asking users whether a post is 'worth your time'.
"In 2019, we introduced surveys to ask people, “Is this post worth your time?” and we use that feedback to inform how we arrange posts in their News Feed going forward. For example, if people say a post is worth their time, we’ll aim to show posts like that higher in News Feed, and if it isn’t worth their time, we’ll aim to show posts like that closer to the bottom."
This approach has provided Facebook with new insights into what users find valuable, as opposed to engaging, so it's now looking to roll out a new set of survey questions focused on what drives 'value' in user experience, as a means to enhance its algorithmic sorting.
That's an interesting approach, because what you find 'valuable' is likely a lot different from what you find 'interesting', or even what you 'like'.
Do you find posts about sports 'valuable'? What about posts from your friends?
And in a brand content context, are your Page updates what people would say is 'valuable'?
That could become a bigger consideration for your approach moving forward.
Along similar lines, Facebook's also running a new series of tests to gauge what types of posts people find 'inspirational' in their feeds.
That seems less likely to yield significant insight - but again, it's another consideration. If Facebook is looking to amplify more valuable, inspirational content, it may be worth factoring that into your Facebook posting process.
Political Content Re-Think
Facebook's also looking to get a better understanding of the broader impacts of political content on the platform, with user feedback suggesting that many people have simply had enough of the divisive, aggressive and finger-pointing political debates.
Facebook CEO Mark Zuckerberg made a specific point of this in Facebook's most recent earnings call, noting that:
"One of the top pieces of feedback we're hearing from our community right now is that people don't want politics and fighting to take over their experience on our services.”
Indeed, after a period of highly divisive politics around the world, it has, at times, felt overwhelming, with friends and family often being separated along political lines purely due to Facebook posts and debates.
Now, it seems, Facebook users have had enough, and Facebook is seeking to address this in its News Feed re-think.
"Even though your News Feed contains posts from the friends, Groups and Pages you’ve chosen to follow, we know sometimes even your closest friends and family share posts about topics that aren’t really interesting to you, or that you don’t want to see. To address this, we’ll ask people whether they want to see more or fewer posts about a certain topic, such as Cooking, Sports or Politics, and based on their collective feedback, we’ll aim to show people more content about the topics they’re more interested in, and show them fewer posts about topics they don’t want to see."
While politics is only one of three topics mentioned here, it seems likely to be the key focus. I mean, cooking and sports content might be a little annoying, but they're not likely to be as sensitive as politics-related updates.
Facebook also specifically focuses on politics with its next point of revision:
"Increasingly, we’re hearing feedback from people that they’re seeing too much content about politics and too many other kinds of posts and comments that detract from their News Feed experience. This is a sensitive area, so over the next few months, we’ll work to better understand what kinds of content are linked with these negative experiences."
In addition to direct response queries, Facebook says that it will also look at post response metrics, like 'angry' reactions, to help it gauge what users don't want to see.
That could be a positive step. Facebook is often blamed for fueling broader societal division due to its seemingly blind focus on maximizing user engagement above all else.
Engagement is driven by emotional response - people comment on a post because it sparks a strong emotional reaction, they share a post for the same reason, they react because the post triggers them in some way. The most powerful emotional drivers in this respect are happiness and anger, and this is largely reflected in what you'll see within your Facebook feed.
That's also led to more businesses changing their content approach based on these triggers, in order to maximize reach and clicks. It's arguable, for example, that news outlets like Fox News have been emboldened, or even fueled, by algorithms like the one powering Facebook's News Feed, which essentially incentivizes partisan, divisive content in order to spark audience reactions - more comments, more shares, and thus more clicks through to their websites.
In the wake of the Trump presidency, and more specifically, the Capitol riots, it seems that Facebook is now taking a harder look at this element, and this new approach, assessing the impacts of political content, could lead to a reduction in negative experiences on the platform, reducing divisive content in user feeds.
More Direct Response
Finally, Facebook's also testing a new option to make it easier for people to hide posts they're not interested in within their feed, which will then inform its algorithms about what you don't want to see.
"If you come across something that you find irrelevant, problematic or irritating, you can tap the X in the upper right corner of the post to hide it from your News Feed and see fewer posts like it in the future."
That could be an easy way to gather more direct user response, while also forming new habits that help users better control their feeds, with the understanding that whatever you tap that X on will inform the algorithm of your preferences.
As with every Facebook News Feed update, there are significant implications here, with each of these changes potentially leading to new distribution shifts that could alter what people see - and how Page managers approach their Facebook strategy.
The focus of note, based on these explanations, is:
- More 'valuable' content
- More 'inspirational' posts
- Less divisive political stances
The feedback Facebook gets from these new tests could change these elements, but it seems like this is the way Facebook is leaning.
That's worth considering in your Facebook marketing and outreach process moving forward.
Also, keep an eye on your Facebook reach stats for any significant changes.