Fresh off another controversy related to the potential exploitation of children on its platform, YouTube has announced that it will now take a stronger, more definitive stance against 'hateful and supremacist' content, banning all videos which promote discrimination or exclusion against any specific group.
As explained by YouTube:
"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory."
That's a major policy shift, one which will impact thousands of YouTube channels. It'll also see various channels caught in the crossfire, unleashing a flood of challenges and appeals which could take some time for YouTube's team to work through.
Case in point - journalist Ford Fischer claims that his channel has already been demonetized as a result, despite his videos documenting activism from a journalistic standpoint.
Within minutes of @YouTube's announcement of a new purge it appears they caught my outlet, which documents activism and extremism, in the crossfire.
I was just notified my entire channel has been demonetized. I am a journalist whose work there is used in dozens of documentaries. pic.twitter.com/HscG2S4dWh
— Ford Fischer (@FordFischer) June 5, 2019
No doubt there'll be a range of cases like this - which, as noted, could take YouTube a long time to sort through.
In fact, YouTube acknowledges that some of the content removed under this tough new stance may still have value to those studying the problem:
"We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future."
Also now on YouTube's banned list - content which denies well-documented events:
"Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."
The announcement addresses various concerns which have been raised about YouTube in recent months.
Back in January, YouTube announced changes to its recommendation algorithm after reports found that its system was promoting and amplifying conspiracy theories and fake news content.

As noted, YouTube has also been caught up in investigations over video clips featuring children which are allegedly being shared among online pedophile groups through the platform.
These are major areas of concern, and a significant part of the problem in each case is YouTube's algorithm, which shows users more of what they like. Because machine learning systems have no capacity for judgment, it doesn't matter to their calculations what that content actually is.
If you watch videos of children, YouTube will show you more of the same - a logical, yet potentially concerning and dangerous, process. YouTube could look to lessen the impact of its algorithm, or even remove algorithmic recommendations entirely, leaving it up to users to decide what they watch, but that would also likely reduce the time users spend on the platform. So if that's not an option, what's the next approach?
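To illustrate the point, here's a minimal, purely hypothetical sketch of an engagement-driven recommender - not YouTube's actual system, whose details aren't public - showing how a ranking based only on similarity to past viewing surfaces 'more of the same' without any judgment about the content itself. The video IDs and tags are invented stand-ins for whatever features a real system would learn.

```python
# Illustrative toy example only - not YouTube's actual system.
# It surfaces whatever most resembles what a user already watches,
# with no judgment about what that content is.
from collections import Counter

# Hypothetical watch history and candidate pool (tags stand in for learned features)
watch_history = [
    {"id": "v1", "tags": {"kids", "family"}},
    {"id": "v2", "tags": {"kids", "vlog"}},
]
candidates = [
    {"id": "v3", "tags": {"kids", "vlog"}},
    {"id": "v4", "tags": {"news", "politics"}},
    {"id": "v5", "tags": {"kids", "family"}},
]

def recommend(history, pool, k=2):
    """Score candidates purely by overlap with the user's past engagement."""
    profile = Counter(tag for video in history for tag in video["tags"])
    scored = sorted(
        pool,
        key=lambda v: sum(profile[t] for t in v["tags"]),
        reverse=True,
    )
    # Nothing here inspects *what* the content is - only how similar
    # it is to what the user already engages with.
    return [v["id"] for v in scored[:k]]

print(recommend(watch_history, candidates))  # ['v3', 'v5'] - more of the same
```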
Enforcing stricter content rules, theoretically, addresses the core of the problem while still enabling YouTube to benefit from its recommendations - which, it's worth noting, currently drive up to 70% of the platform's video views.
The algorithm remains the main driver of this amplification, but given its engagement value, YouTube has opted to target the content itself instead. And the impacts of that change will be significant.
That's not to say that such bans shouldn't happen - anything that can be done to reduce the amount of hate speech and division online is a good thing. But the question will be where YouTube draws the line, and what that means for online freedom.
There is a risk, of course, that the banned broadcasters could merely splinter off and take their followings to other platforms. Maybe DailyMotion or Vimeo will be more accommodating of their views, or maybe they'll take them to a private Facebook group or to Instagram's IGTV, or someplace else which enables them to continue monetizing their content.
Maybe the more prominent creators will work out their own video hosting solutions on their own websites, or they could band together, along with people like Alex Jones, to build a whole new network of their own. YouTube's likely not overly worried about that scenario, but the problem is that these users already have large followings, and they can still share their views on other platforms and through other means.
Of course, YouTube can only do so much - it can only eliminate such material from the platform it controls. And while the change will certainly lessen concerns around YouTube's more controversial elements, it also opens the door to increased enforcement, with users now given wider scope to report discrimination and abuse, which may make it harder for creators to stay within YouTube's rules.
In essence, YouTube's new stance is a good thing - such views and perspectives shouldn't be up for amplification and monetization, which, in many cases, really amounts to exploitation. But the margins around what YouTube deems acceptable just got significantly narrower, which also makes the gray areas in between a lot murkier.
Will that be good for an open internet overall? Will that reduce or exacerbate the problem in other forms?
Only time will tell.