After complaints that YouTube has essentially been promoting conspiracy theories and fake news-type content through its recommendation system, the Google-owned video platform announced this week an update to its recommendation process, which aims to limit the highlighting of uploads that "come close to - but don’t quite cross the line" of violating its Community Guidelines.
As detailed by YouTube:
"We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways - such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."
There have been various investigations into how YouTube's recommendation system showcases such content, with violent and offensive videos even being served to kids watching via the dedicated YouTube Kids app. At least one researcher has also suggested that "YouTube’s algorithms may have been instrumental in fueling disinformation during the 2016 presidential election" - an area over which Facebook has been dragged through various levels of investigation, but from which YouTube has largely remained unconnected.
And certainly, it's not hard to go down a conspiracy theory rabbit hole via the recommended videos on the platform.
The problem could also be exacerbated by YouTube recently rolling out an option to side-swipe, Stories style, over to the next video in your recommendations list, making it even easier to binge on a cluster of clips.
"Rolling out this week on the YouTube iOS app (for 6S phones & above)!
From your video player:
Swipe left for the next video
Swipe right for your previously watched video
If you prefer, you can still tap for video player controls. More details here → https://t.co/ur4EiHCiii pic.twitter.com/Es2ExHl8m9"
— Team YouTube (@TeamYouTube) January 15, 2019
YouTube, however, has downplayed the potential impact, noting that "this shift will apply to less than one percent of the content on YouTube". The actual impact is difficult to quantify, but if it reduces the spread of offensive or untrue content, then it's beneficial. It's worth noting that YouTube also recently announced a ban on "content which encourages violence or dangerous activities that may result in serious physical harm, distress or death" in response to an influx of users posting 'Bird Box Challenge' clips.
Clearly, YouTube has made cleaning up its platform a priority in 2019. The company also recently provided more insight into how it defines offensive content with regard to potential ad placement.
With digital platforms coming under more scrutiny over the role they play in distributing such material, and in inciting offline actions, YouTube looks to be getting ahead of bigger blowback. The platform's recommendations, as noted, have been a known issue for some time, but with government officials now taking a harder look at all online channels, it may only be a matter of time until there are louder calls for industry regulation and review.
YouTube appears to be working to address such concerns before that next wave.