With COVID vaccination programs ramping up, Facebook has announced a range of new measures to help promote vaccine take-up, while it's also expanding its restrictions on anti-vaccine content across its platforms to limit the impact of harmful, anti-science messaging.
First off, Facebook's adding a new element to its COVID-19 Information Center in the US, which will alert people when they become eligible for vaccination, and where they can get it.
The prompts will help boost awareness of local vaccine roll-outs, and will be expanded to more countries in the coming weeks.
In addition to this, Facebook is also allocating $120 million in ad credits to help health ministries, NGOs and UN agencies reach more people with COVID-19 vaccine and preventive health information. Facebook will also provide additional training and marketing support to help governments and health organizations maximize Facebook for their vaccine messaging, which will also include training on WhatsApp, furthering reach.
Facebook's also expanding its COVID-19 Information Center to Instagram to further boost information sharing.
Misinformation is also a critical concern, and Facebook is stepping up its fight against false claims, expanding the rules around what qualifies as vaccine misinformation, which will subsequently be removed from its apps:
"Following consultations with leading health organizations, including the WHO, we’re expanding the list of false claims we will remove to include additional debunked claims about COVID-19 and vaccines."
The expanded list will include claims that:
- COVID-19 is man-made or manufactured
- Vaccines are not effective at preventing the disease they are meant to protect against
- It’s safer to get the disease than to get the vaccine
- Vaccines are toxic, dangerous or cause autism
People who continue to share such claims risk full bans from Facebook's apps, while Facebook will also push some group admins to remove questionable posts in line with the updated rules.
Worth noting, too, that Facebook's updated approach to vaccine misinformation comes in response to its newly appointed Oversight Board, which, in one of its first rulings, criticized the platform's “inappropriately vague” rules around health misinformation. That ruling prompted Facebook to update its approach, a positive sign for the influence of the new Oversight body.
Facebook's updated list of outlawed claims is available here.
In addition, Facebook is looking to provide more insight into vaccine and vaccine messaging trends to help health authorities maximize their response.
"To to help guide the effective delivery of COVID-19 vaccines, survey data will provide a better understanding of trends in vaccine intent across sociodemographics, race, geography and more. The scale of the survey will also allow for faster updates on changes in trends, such as whether vaccine intent is going up or down in California in a given week and better insights on how vaccine intent varies at a local level."
Facebook has shared some of these insights on its Research blog, underlining community responses to COVID mitigation efforts.
By providing more understanding of where, for example, COVID vaccine resistance is high, Facebook can help organizations refine their messaging to boost take-up, and maximize response rates.
It's good to see Facebook stepping up its efforts in this respect. While it has been heavily criticized for allowing health misinformation to circulate on its platforms in the past, Facebook has clearly recognized the need for unity on this front, and the potential harm that can be caused by letting incorrect information spread.
Maybe that will be a lesson for all forms of harmful misinformation moving forward, and we'll see Facebook take similar steps to outlaw those as well.
Maybe. We'll have to wait and see.