After a range of reports highlighting significant concerns related to YouTube content aimed at kids, the platform has taken the dramatic step of disabling comments on "tens of millions of videos that could be subject to predatory behavior".
In a new blog post, YouTube has announced the action, which is part of its ongoing process to address concerns following the discovery of a pedophile ring that had been using the platform to find and share clips featuring young children in states of undress.
As per YouTube:
"We know that many of you have been closely following the actions we’re taking to protect young people on YouTube and are as deeply concerned as we are that we get this right."
The decision follows YouTube's recent removal of more than 400 channels and their related activity, and represents a significant ramp-up in response from the platform, which has repeatedly come under fire over its content moderation processes in recent times.
Few could argue that there's a more pressing concern - protecting children and young users is something that all platforms need to take seriously, and while disabling comments entirely for such content is a drastic measure, if it helps, YouTube should absolutely take it.
YouTube does note that:
"A small number of creators will be able to keep comments enabled on these types of videos. These channels will be required to actively moderate their comments, beyond just using our moderation tools, and demonstrate a low risk of predatory behavior. We will work with them directly and our goal is to grow this number over time as our ability to catch violative comments continues to improve."
YouTube also says that it's working on a new 'comments classifier' tool which will more effectively identify and remove predatory comments.
It is a huge concern to see YouTube content being used in such a way, and it underlines, once again, the darker side of the internet - the exposure that all users have to those who seek to exploit their images and content. That it's children in this instance makes it all the worse, and once again highlights the need for all digital platforms to allocate more of their revenue to detecting and removing concerning content, and to protecting their users.
That, in itself, poses a range of complex challenges (just read Casey Newton's harrowing overview of what Facebook moderators go through), but clearly more needs to be done - more effort needs to be undertaken to identify and eliminate misuse.
In this context, deactivating comments is a small, but worthwhile, step.