According to a new report from The Reuters Institute for the Study of Journalism, which incorporates the responses of more than 74,000 people in 37 countries about their digital news consumption habits, the use of social media for news is declining, almost entirely driven by people turning away from Facebook.
As per the report:
“In the U.S., for instance, 39% of people said they used Facebook as a source of news in 2018, down 9 percentage points from 2017. And if you look just at young people in the U.S., their use of Facebook for news is down by 20% compared to 2017.”
Again, this is self-reported data; there's no definitive usage data from Facebook to back such trends up. But they do make a lot of sense.
Facebook has obviously lost a lot of consumer trust, particularly as a news source, following the revelations of how the platform has been used by Cambridge Analytica and other politically affiliated organizations to influence voter behavior. The fact that Facebook can be used for such purposes – which its own research proves – has understandably made some people more wary of the content shared on the platform. But even aside from news content in isolation, the study data also highlights a broader social media usage trend.
In their further examination, the Reuters researchers found that people are now far more comfortable holding their discussions on WhatsApp – “whose use for news across countries has almost tripled since 2014 – though, in the U.S., only four percent of respondents said they get news from it”.
The researchers found that Facebook and other social platforms (like Twitter) remain key sources for news content, but people are increasingly likely to find links there, then share them on WhatsApp, which keeps their discussion more contained, and frees them from potential outside judgment – and a permanent record of their thoughts.
That’s indicative of a broader social trend – as we reported recently, more and more social media conversations are now switching to messaging apps.
Even though Facebook owns both WhatsApp and Messenger, the enclosed nature of these discussions – or indeed, the fact that they’re not public – has led more people to migrate their interactions into more intimate groups.
The divisiveness of social platforms has become increasingly apparent – people, in many ways, are forced to choose sides in public discussions, which, along with algorithm shifts, has led to the development of filter bubbles, enclaves in the public conversation where people feel safe and where their perspective is largely reinforced. Private chats avoid the public-facing element of this, though they may still be just as isolating.
Either way, the stats don’t lie: more and more people are using messaging apps for their discussions, and Facebook, according to user sentiment at least, is losing some ground. Whether we see the same in actual usage stats, we won’t know until the company’s next performance report is released.
For their part, Facebook is working to remove fake and misleading news, and improve the reputability of the content shared on its platform. In an update posted this week, Facebook Product Manager Tessa Lyons detailed how the evolution of their third-party fact-checking system is helping to eliminate bad actors.
“We started the third-party fact-checking program in December 2016. Now we have 25 partners in 14 countries, many with recent or upcoming elections. Our partners are independent and certified through the non-partisan International Fact-Checking Network. When fact-checkers rate an article as false, we show it lower in News Feed — reducing future views by over 80% on average.”
That sounds like a positive step - but then again, it may also be helping to fuel ongoing divisiveness. Free speech advocates question how ‘non-partisan’ the fact-checkers actually are, which again feeds the growth of filter bubbles, fueled by what some may see as editorial interference.
But there’s not really a heap more Facebook can do – they are working to improve the quality of news shared on the platform, and they recently removed the ‘Trending’ section in favor of upcoming news initiatives (most likely Facebook pushing their new, exclusive news programs on Watch).
Given the sentiment data, you can see why Facebook is now moving to take such drastic action, after initially dismissing suggestions that their platform could have influenced the 2016 US Presidential Election as ‘a pretty crazy idea’.
Teens say they’re using it less; people say they no longer trust the news content on the platform. The signs suggest that Facebook still has a lot of work to do to win back consumer trust.