Here's a case of serendipitous timing: this week, as Facebook was being praised for its efforts to stop the spread of a new anti-vax video promoting COVID-19 conspiracy theories, a new report showed that the platform has facilitated the spread of a wide range of health misinformation, and has likely contributed to the COVID-19 death toll as a result.
According to an investigation conducted by human rights group Avaaz, Facebook Pages spreading health misinformation saw a cumulative 3.8 billion views in the first five months of 2020.
As per the report:
"Health misinformation is a global public health threat. Studies have shown that anti-vaccination communities prosper on Facebook, that the social media platform acts as a ‘vector’ for conspiracy beliefs that are hindering people from protecting themselves during the COVID-19 outbreak, and that bogus health cures thrive on the social media platform."
The Avaaz team used publicly available Facebook data to track views of the Facebook Pages of 82 health misinformation-spreading websites on the platform between May 28th, 2019 and May 27th, 2020.
The data shows that interest in these sites peaked in April, as the pandemic spread throughout the world.
Interest has declined since, which may be attributable to Facebook's expanding efforts to combat COVID-19 misinformation.
But even more concerning is the number of views these Pages have attracted compared with those of official health bodies.
The Avaaz report once again highlights concerns about the role Facebook plays in spreading dangerous misinformation - which is even more significant when you also factor in that the majority of Americans now get at least some of their news content from social media, with Facebook being the leading source.
And that's only likely to increase. The pandemic has forced the closure of a range of regional and local news publications, leaving many people with no source of official, local news. So where do you think they'll turn instead?
Given Facebook's scale and reach, the fact that it provides a platform for these fringe beliefs and movements is a significant concern, and should be investigated. How, exactly, the situation can be rectified is difficult to say, especially given that Facebook would prefer to err on the side of free speech in order to maximize user engagement.
And the fact is, as these stats show, more salacious, counter-mainstream narratives gain traction on the site. Facebook would argue that this is a people problem, not a platform one, and that the data simply reflects people's interests. But again, given Facebook's potential for amplification, the concern is relevant.
And as noted by MIT Technology Review:
"A study in the American Journal of Tropical Medicine and Hygiene last week found that around the world, at least 800 people may have died and 5,800 been admitted to hospital as a result of coronavirus misinformation in the first three months of 2020, many of them after drinking methanol or cleaning products that they believed could cure COVID-19."
This is not harmless banter; there are real-world consequences stemming from such movements.
The challenge, then, is where to draw the line - wherever Facebook sets its boundary, people will push against it to maximize views and engagement.
Who then decides what's okay and what's not - and if Facebook does take a harder line, would that see it lose relevance, and audience, as a result?
In related news, Facebook has today announced that it will now allow some ads for hand sanitizer and disinfectant wipes once again. Facebook banned ads for these products back in March due to concerns over COVID-19 profiteering.