Last week, YouTube CEO Susan Wojcicki outlined the platform's ongoing efforts to address key areas of concern in her latest quarterly update.
Among her notes, Wojcicki explained the platform's "four R's" guidelines - Remove harmful content, Raise authoritative voices, Reduce borderline content, and Reward trusted creators - which direct its processes in this regard.
This week, YouTube has provided a more specific update on the first element listed - removing inappropriate content - and how it's been working to improve its efforts on this front.
As per YouTube:
"We've been removing harmful content since YouTube started, but our investment in this work has accelerated in recent years. Because of this ongoing work, over the last 18 months we’ve reduced views on videos that are later removed for violating our policies by 80%, and we’re continuously working to reduce this number further."
That's a significant result - and to provide more context, YouTube has also shared a visualization of its evolving removal processes.
One of the biggest areas of concern in this respect is hate speech, and YouTube has made significant advances here too.
"We spent months carefully developing the [hate speech] policy and working with our teams to create the necessary trainings and tools required to enforce it. The policy was launched in early June, and as our teams review and remove more content in line with the new policy, our machine detection will improve in tandem."
Indeed, according to YouTube's stats, the updated hate speech policy has resulted in the removal of more than 17,000 accounts.
These are significant numbers, underlining YouTube's ongoing commitment to user safety, and to improving the experience for both individuals and businesses with regard to ad placement.
Among other key stats:
- YouTube says that over 87% of the 9 million videos it removed in the second quarter of 2019 were first flagged by its machine learning systems.
- An update to its spam detection systems in the second quarter of 2019 led to a more than 50% increase in the number of channels terminated for violating its spam policies.
- The almost 30,000 videos removed for hate speech over the last month "generated just 3% of the views that knitting videos did over the same time period".
YouTube has come under more scrutiny in recent times amid concerns that users can fall down rabbit-holes of misinformation, and essentially become radicalized as a result.
Back in June, The New York Times profiled YouTube user Caleb Cain, who had been drawn further and further into far-right conspiracies by YouTube's content recommendations.
As noted by NYT:
"The common thread in many of these stories is YouTube and its recommendation algorithm, the software that determines which videos appear on users’ home pages and inside the “Up Next” sidebar next to a video that is playing. The algorithm is responsible for more than 70 percent of all time spent on the site."
The case highlights YouTube's significant influence, and the influence of its algorithm in particular, which is why its work to improve its processes in this respect is so essential.
Clearly, based on these stats, YouTube is taking that challenge seriously - and while it will never be able to totally eliminate negative elements from its site, the numbers here suggest that its systems are improving, and that it's actively working to limit misuse.