As we head into another US Presidential Election cycle, the debate over the spread of fake news, and who's responsible for it, is once again set to become a key focus.
All platforms are working to limit the impact of fake news, but YouTube, in particular, has a significant role to play. With so many people now getting news content and information from the online video giant (around one in five YouTube users say that the platform helps them understand what's happening in the world), it needs to ensure, where possible, that it has measures in place to reduce the flow of misleading content, while also amplifying relevant, accurate news and information.
This has been a key focus for the platform in recent times - as explained by YouTube in a new update this week:
"Over the past couple of years, we've been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation. And we are already seeing great progress. Authoritative news is thriving on our site. And since January 2019, we’ve launched over 30 different changes to reduce recommendations of borderline content and harmful misinformation. The result is a 70% average drop in watch time of this content coming from non-subscribed recommendations in the U.S."
YouTube has mapped out its various updates on this front in the chart below:

Those results are certainly promising, and YouTube continues to add new measures to reduce the spread of misinformation, or dispel concerning trends that are not grounded in fact.
For example, YouTube now also shows information panels on content 'prone to misinformation', which provide links to relevant resources for more insight.

YouTube also says that it's working to prioritize "authoritative voices" for news and information queries in search results and “watch next”:
"For example, try searching for “Brexit.” While there will be slight variations, on average, 93% of the videos in global top 10 results come from high-authority channels."

YouTube's also working to address concerns with borderline videos - content that "comes close to, but doesn’t quite cross the line of" violating its Community Guidelines. YouTube says that such videos make up a tiny proportion of its overall viewership, but it's now expanding its program of reducing recommendations of borderline content "or videos that could misinform users in harmful ways" into more regions.
As noted, this is an important area for YouTube, because an increasing number of people now come to the platform for information, and can be led down concerning rabbit holes by the content recommended to them in relation to their search queries.
Earlier this year, The New York Times published a profile of YouTube user Caleb Cain, who claims that he was radicalized by the platform, sinking further and further into conspiracy theories and extremist views with each tap on his 'Up Next...' recommendations.
A poll conducted by the Pew Research Center last year also showed that around two-thirds of YouTube users (64%) "at least sometimes" encounter videos that seem obviously false or untrue while using the site.
As noted, YouTube has been testing various ways to reduce these impacts, including hiding comments on certain videos to limit the related discussion. The results suggest that these changes are having a positive effect - though the real test will come as the Presidential spin cycle hits top gear.