It must feel a little like déjà vu at Facebook right now.
Earlier this month, reports began circulating that The Social Network was working on a new, dedicated news tab, with the company reportedly offering news outlets "millions of dollars" for the rights to put their content in this new news section. Now, the next element of Facebook's news offering has been revealed - and it's starting to sound a lot like the Trending News section which Facebook got rid of last year.
According to The New York Times, Facebook will hire a team of journalists to work on its news section, with the positions for these new roles now being advertised on Facebook's website:
Note the mention of 'credible content' - a key aspect of Facebook's news refresh is that it will be built on 'trusted' news sources, helping Facebook to guide the 43% of US adults who now get some news content from its platform towards the right information on any given topic.
That's what Facebook CEO Mark Zuckerberg emphasized in his initial discussion of the idea back in April - during a sit-down interview with Mathias Döpfner, the CEO of German publishing house Axel Springer, Zuckerberg outlined the plan for a Facebook news tab, which would better showcase content from 'high quality, trusted' sources.
As noted by Zuckerberg (around the 9-minute mark):
"We want this to surface high quality and trustworthy information, so, of course, anything that we do is going to be personalized, but there's a question I have, which is what is the level of curation that we should have in order to - and we're not going to have journalists making news.. [...] What we want to do is make sure that this is a product that can get people high-quality news."
That's a logical aim, but it also raises the question - who exactly will decide what's 'quality' and/or 'trustworthy' info?
That's where these new journalist resources will come in. The hiring of trained journalists won't necessarily eliminate concerns about bias - which, incidentally, Facebook is also looking to tackle through a separate investigation and project - but it will help to ensure the new section avoids the problems that plagued Facebook's entirely algorithm-defined Trending News section after the company got rid of its manual editors, amid claims of internal bias and algorithm tampering, back in 2016.
So it's back to the future for Facebook - a return to the system it essentially had in place for Trending News initially, though with more lessons learned, and more emphasis on internal specialists, as opposed to contracted curators.
The Information has additionally reported that Anne Kornblut, a Pulitzer Prize–winning former editor for The Washington Post, is helping to lead Facebook’s effort to hire its new team of editors.
As per The Information:
"Facebook plans to use algorithms to curate most of the stories in the forthcoming news tab, which is slated to debut before the end of the year. Human editors will select breaking and “top” news stories, a spokesperson told The Information."
So, we know that Facebook is hiring experts, that it will utilize algorithms to a degree, and that it will seek to make a significant push into news content sometime soon. The last thing we need to know now is what, exactly, its new News section will look like.
In Zuckerberg's initial imagining of a refreshed Facebook news surface, his vision was for a Facebook Watch for news:
"One of the things that's really worked over the last year or two is we've launched [Facebook Watch] for video, where people who weren't getting all the video they wanted in News Feed could go to a place that's a dedicated space to get video. Because that has started to really grow quickly, we've decided that there really is an opportunity to do something like that with news as well."
Facebook Watch lives in an entirely separate tab within the app - could Facebook News get the same treatment? Given the significant investment Facebook is pushing into the project, and the relevance of breaking news content in particular, it seems more likely that the company will want to showcase at least some news content in the main feed, leading people into the news section.
It could, for example, look like the alternate subject feeds which Facebook tested back in 2017:
Those entirely algorithm-defined listings were not particularly useful, and Facebook abandoned the project after a brief test period. But maybe a curated set of swipeable feeds could work.
Interestingly, this is also similar to the format that Twitter is trying with its new topic-based feed listings, which will essentially be an extension of its recently launched swipeable content streams based on lists.
Customizable timelines that are easy to access? We’re thinking about ways to do this! One idea we had is for you to be able to swipe to your lists from home. If you're in the test tell us what you think! pic.twitter.com/g5WMaNZ57N
— Twitter (@Twitter) June 25, 2019
Twitter's new topic streams will also be curated, another move away from entirely algorithm-defined streams for news content. Essentially, what the major platforms have found is that while algorithms most definitely boost engagement, by helping to direct relevant content to users, they can also help fuel inherent bias, and facilitate concerning movements.
Ideally, that's exactly what algorithms were designed to avoid - the platforms put them in place, at least in part, so they could take a more hands-off approach to content distribution, with what gets surfaced defined not by their own editorial judgment, but by what users themselves share.
Given that this process has resulted in questionable content being widely distributed anyway, it makes sense that the platforms now need to take a more hands-on editorial approach, filtering out the bad elements.
Will that lead to a better outcome? For users, you would assume so, as it should theoretically help to reduce misinformation and the sharing of outright lies. But as we've seen in recent years, truth, especially on social media, is in the eye of the beholder.
When people don't even trust established facts - like, say, the shape of the earth - who's to say what's actually 'trusted' information?