YouTube has confirmed that it will reduce the recommendation and distribution of videos which promote conspiracy theories that link the spread of COVID-19 to 5G technology.
This comes after a spate of attacks on cell phone towers in some regions - according to The Guardian, mobile towers in Birmingham, Merseyside and Belfast in the UK were set on fire over the past week, in attacks that have been linked to the spreading theory. Mobile carrier workers have also reportedly been subjected to abuse over concerns about 5G's role in the pandemic.
The circulating rumor suggests that 5G signals exacerbate the spread of the virus. The core claim centers on the use of 5G in Wuhan, where COVID-19 originated - Wuhan, the theory goes, was also the first region of China to get full 5G coverage, with the rollout significantly ramped up in October last year, ahead of the outbreak. Scientists have debunked the idea, noting that many regions of China beyond Wuhan have 5G coverage, while COVID-19 is also spreading fast in many regions that don't yet have 5G infrastructure. Yet the theory has been gaining momentum, with even celebrities like actor Woody Harrelson re-sharing the concept.
YouTube says that it will remove any content which violates its Community Guidelines, while it will also significantly reduce the reach of 'borderline' videos, which push conspiracy theories but don't quite cross that line.
The controversy is the latest in a string of content headaches of this type for the platform - if you're looking for conspiracy theories and internet rabbit holes to tumble down, YouTube is likely where you'll eventually end up.
The online video giant has become known for hosting fringe content, while its algorithmic recommendations can drag people further in, reinforcing such ideas by surfacing more of the same.
Indeed, last year, The New York Times profiled a 26-year-old man who had been 'radicalized' by YouTube content, highlighting concerns with the platform's 'Up Next' prompts, which, he says, lured him deeper and deeper into violent, extremist views.
YouTube has been working to address this. In January last year, the platform announced that it would limit recommendations of content which came close to violating its Community Guidelines, but didn't quite cross the line. The examples YouTube provided at the time were videos relating to miracle cures and conspiracy theories, including those about 9/11 and 'flat earthers'.
In June, YouTube outlined further improvements to its recommendations algorithm to reduce such impacts, while at the same time it also ran a test that hid all video comments on the platform by default, with users needing to tap a button to view any related discussion.

That test sought to address a separate concern around predatory behavior on the platform, but taken together, the changes show that YouTube is treating indoctrination and radicalization as serious problems - and with more than 2 billion monthly active users, its influence in this regard can be significant.
It makes sense, then, that YouTube is moving to limit the reach of COVID-19/5G conspiracy theories. The risks here are significant - on one hand, the theories misattribute the virus' spread to a false cause, which could dilute the messaging from public health authorities. On the other, if cell towers are being damaged, that could limit the capacity for authorities to provide all regions with relevant, timely updates.
Information is key in combating COVID-19 - people need to know what they should be doing, what they shouldn't, and how they can contribute to limiting its spread.
It may seem like a crazy theory that makes little sense. But a well-made YouTube video can lend significant credence to such concepts - and when you consider that the platform's recommendation algorithm will try to show you more of what you're interested in, you can imagine some viewers seeing a whole row of additional videos pushing the same theory, adding further weight to the idea.
It's important for YouTube to take action - and not just in this instance, but against all such narratives that can lead to dangerous behavior.
UPDATE (4/7): Facebook has also announced that it will now remove posts promoting 5G conspiracy theories related to COVID-19.
UPDATE (4/22): Twitter has also updated its policies to remove tweets which "could lead to the destruction or damage of critical infrastructure, or could lead to widespread panic, social unrest, or large-scale disorder". This covers many 5G/COVID-19 claims.