With misinformation about COVID-19 still gaining significant traction across the social media ecosphere - including contradictory reports on mask-wearing, possible cures, potential treatments and more - Facebook is adding another tool to try to slow the spread of such posts: a new pop-up that will appear when users go to share a COVID-19 related post, pointing them to official health resources and providing more context about the post.

As explained by Facebook's David Gillis:
"We've seen good results from our informed sharing screens, and today are rolling out a new treatment for COVID-19 related links. Before re-sharing, you’ll see links' source and date. We hope this helps people get more context at a time and on a topic that is rapidly evolving."
As you can see in this example, the new pop-up will show the user the source publication and how long it's been publishing on Facebook. It'll also display the original date that the article was published, which may help to slow the redistribution of older posts.
Back in June, Facebook added similar prompts for general posts, which appear when people look to share content that's more than 90 days old.
It's the latest in Facebook's various efforts to curb the flow of COVID-19 misinformation - over the last few months, Facebook has:
- Sought to remove all posts across its platforms which present false claims about cures, treatments, the availability of essential services, and/or the location and severity of the outbreak
- Banned all ads and commerce listings which seek to capitalize on fears, including all listings of face masks, hand sanitizer, disinfectant wipes and COVID-19 testing kits
- Invested more funding into fact-checking resources to give them more capacity to detect and flag potentially misleading posts
- Started removing all non-official COVID-19 accounts from recommendations listings on Instagram, as well as any AR effects related to coronavirus
- Added labels to show people when they've received a forwarded or chain message on WhatsApp
- Set a limit on the number of times messages can be forwarded on WhatsApp to reduce the spread of viral messages (which is also now being tested on Messenger)
- Added new Google search prompts in WhatsApp threads to confirm message details
- Added an official COVID-19 information centre, which highlights official updates from relevant organizations, and is displayed in user News Feeds
- Improved its machine learning tools in order to better identify and ban accounts engaged in mass messaging
That's an impressive list of additions. If only Facebook went to so much effort with all forms of misinformation, right?
That's a broader question, as to what Facebook deems false or inaccurate news, and how much it can do to address it in each instance. But in this case, COVID-19 misinformation poses a very real and immediate threat to public health, which is why Facebook's working so hard to remove such content.
But really, it can't get rid of all of it. Case in point: recently, right-wing news site Breitbart published a video showing a group of medical 'experts' touting various COVID-19 conspiracy theories. Facebook removed the video for violating its policies, but it was active on the platform for several hours, and garnered some 20 million views before The Social Network took action.
There are various theories as to why it took Facebook so long to act, but the fact is that such content can and does spread on Facebook, and it can still gain traction in private groups.
As such, it's good to see Facebook implementing as many measures as possible to address such concerns. But questions still need to be asked about whether it could do more to stop activist and extremist groups from gaining momentum via its tools.