With misinformation causing significant division and disagreement online, and leading to widespread confusion around major issues, Facebook has launched a new education course, in partnership with Reuters, which aims to teach journalists what to look for in the use of manipulated media.
As reported by Axios:
"Facebook is spending six figures to fund a course on manipulated media and deepfakes for newsrooms, executives. The course material has been developed by Reuters, and Facebook is funding its international expansion as a part of the Facebook Journalism Project."
The course itself, which is publicly accessible here, provides an overview of the growing issues around manipulated media, including deepfakes, which Facebook has also launched a separate, dedicated research project around in order to get ahead of the use of manipulated video.
But it's not just deepfakes that Facebook's looking to address - the course covers five different types of manipulated media, including 'Lost Context' and 'Edited Media'.
And while deepfakes do look set to become a bigger problem in the future, especially as their accuracy improves, it's often these other elements that cause major misunderstanding. Take, for instance, 'Lost Context', which Facebook says is the most common type of manipulated media globally.
You've no doubt seen this yourself, as this type of manipulation tends to gain viral traction - take, for example, this video that saw a huge boost in viral traffic on Facebook last year.
The video purports to depict a Muslim refugee vandalizing a religious statue in Italy, and as you can see, it's been viewed millions of times, and shared just as much, further stoking religious tensions.
But that's not what this video actually shows. The incident depicted actually happened in Algeria, a majority Muslim nation, years earlier, and the man's not attacking a Christian statue, but the statue on the Ain El Fouara fountain, which depicts a naked woman that he considered indecent. The same statue has been vandalized several times for the same reason, because many Algerians see the depiction as distasteful.
This example shows how easy it can be for re-purposed media like this to spark unjustified societal tension - the video, when re-shared in the above manner, falsely suggests that Muslim people don't respect other religious beliefs. The re-purposed video was shared by many anti-Muslim groups to push their agenda - and at a basic glance, you can see how Facebook users could be tricked into seeing something, with their own eyes, that's completely untrue.
This is a major problem - and as noted, this type of manipulation doesn't even rely on high-tech trickery to convince viewers of an alternate interpretation.
The case highlights the need for such an education course, as we look to wade through the sea of misinformation in order to get to the truth of each issue. And that's becoming harder to do - which is why, in many ways, it seems strange that Facebook has opted to exempt political ads from fact-checking.
Political content manipulation is actually used as an example in this new course - in the 'Edited Media' section, the course uses the now infamous Nancy Pelosi doctored video to highlight the concern.
Under Facebook's political ads stance, political groups could actually use this exact video in their campaign, as Facebook won't subject them to fact checks.
That seems problematic, right?
Of course, there are many other ways in which manipulated media is being used to misrepresent certain issues, and as we see more doubt cast over scientific facts and actual events, it's important for us, as consumers, to become more aware of such manipulation. Which is why this course is valuable, and an important initiative for newsrooms to undertake.
And really, it may well be worth members of the public taking the course as well. It can't account for all types of content manipulation, but it does raise awareness of the various ways in which we're being tricked - which, at the least, can help to instill a healthy sense of skepticism around such content.
This is an important area, and it's good to see Facebook at least seeking to take steps towards improvement.