No matter how you look at it, YouTube, along with Facebook, has become a key source of news and information for many people, with users coming to the platform to get the latest information from sources and channels that they trust.
That puts both platforms in a difficult position: maintaining free speech while reducing the spread of harmful misinformation. Both have faced very public challenges on this front - but leading into the 2020 US election, and throughout COVID-19, Facebook and YouTube have taken tougher stances on misleading reports, which has had some impact in reducing false narratives and ensuring people are better informed about the latest events.
But more recently, YouTube has once again been criticized over its decisions, specifically in relation to allowing certain videos that have made misleading claims of widespread election fraud, echoing US President Donald Trump's opinion that the 2020 Presidential Election was rigged against him. Indeed, despite user complaints, YouTube has allowed several videos posted by One America News Network which specifically claim that 'Trump won', amplifying conspiracy theories despite official evidence to the contrary.
Such claims can have a big impact in fueling civil unrest - which is why today YouTube has announced that it will now remove all content that includes misleading claims about the US election.
As per YouTube:
"Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect. Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections."
The action will include videos that claim voting discrepancies "due to widespread software glitches or counting errors".
YouTube's also updating its information and fact check panels to further refute claims of voting errors, which will appear in related searches in the app.
Which will no doubt raise the hackles of President Trump, who's repeatedly called for changes to Section 230 in recent weeks after Twitter and other social platforms added fact-check markers to his election fraud claims.
Now, the very tools which Trump credited as facilitating his win in the 2016 election could end up working against him, as the Trump campaign works to provide evidence to support its claims. Without definitive proof, the platforms are rightfully moving to limit misinformation, and YouTube is taking the next steps in stamping out false claims as it seeks to strengthen its approach to misleading reports.
Which could have a big impact. As noted, YouTube is already a key source of news and information, and the more steps it takes to reduce the spread of misleading or untrue reports, the better. But then again, this is moving into a more editorial approach. Social platforms are private entities; they're entitled to choose what they allow and disallow on their networks. But it will lead to more debate about their influence, and how they choose what users do, and don't, see in important debates.
At this time, however, YouTube is taking the right steps. Misinformation of this type can help fuel counter-movements, and cause friction in the transition, which could become a much larger concern if left unchecked. In this sense, YouTube needs to act, as do all social platforms, to ensure that they're working with official information providers to reduce the spread of false reports.
The fact that there's now a debate over what's true and what's not is a concern in itself, but YouTube's latest steps are another evolution in its approach to misinformation, one which will have a significant impact.