With the European Parliament elections being held next month, Facebook has announced that it will run a new series of digital literacy and education courses for journalists in the affected regions, as part of its broader push to stamp out misinformation and attempts by political groups to manipulate voters through The Social Network.
As explained by Facebook:
"Through the Facebook Journalism Project, we want to help journalists globally to develop their skills for the digital age, whether they’re covering elections, telling stories in new formats, or deepening relationships with their readers."
Facebook will host 10 dedicated events, in which it plans to train around 750 journalists on how to tell digital stories, how to maintain integrity in digital reporting, and, maybe most important, how to spot false news.
Part of the problem with the increasing speed of the media cycle in the social media era is that, in the rush to break news first, some false reports have been amplified by large, reputable outlets. A key example came in 2013, when outlets including CNN and Fox News reported false information in the wake of the Boston Marathon bombing, while many conservative commentators have been keen to highlight, in the wake of the Mueller Report findings, that various news organizations published false reports during the course of the investigation.
Such cases work to sow doubt in the minds of voters as to what is and isn't trustworthy information, and while the vast majority of reporters are indeed reporting the facts, the push to publish quickly can, and has, led to errors.
Facebook's digital literacy course aims to address this, at least in some capacity, while also helping journalists better understand how to distribute their reportage to keep voters better informed.
Among the elements Facebook will be teaching are immersive storytelling, engaging your audience through Facebook Groups, and partnering with fact-checkers to clarify questionable reports.
"Protecting the integrity of elections while making sure people can have a voice is a top priority for Facebook. Our tactics on platform include blocking and removing fake accounts, finding and removing bad actors, limiting the spread of false news and misinformation, and bringing unprecedented transparency to political advertising. Our approach to this problem — like the problem itself — is multifaceted and we know we still have much to do."
Facebook's training sessions will be held in Belgium, Denmark, Finland, France, Germany, Ireland, Italy, Poland, Spain and Sweden.
In addition, Facebook has created a series of free online courses with the Poynter Institute, presented in 14 different languages, including English, Spanish, French, German, Portuguese and Italian.
"More than 100,000 people have participated in these courses to learn how to leverage technology to better reach audiences. Journalists can participate in these online resources to learn about connecting, building, and leveraging their audience for effective storytelling."
After the controversies of the 2016 US Presidential Election, and the subsequent scrutiny of Facebook and its influence over the distribution of news and information, The Social Network has been working hard to provide new ways to limit such misuse, without significantly impacting its business model. Part of the problem, of course, relates to Facebook's News Feed algorithm, which shows users more of what they're likely to agree with, and less of what they won't. That can reinforce confirmation bias, and amplify divides - but the News Feed algorithm is also what's helped Facebook grow to more than 2.3 billion active users, and boost activity and engagement on the platform.
Would Facebook be better without an algorithm? Are people now digitally literate enough to alleviate the need for Facebook to automatically filter content, in order to show them more of what they want, while hiding what they don't?
The argument could be made that Facebook's algorithm is largely to blame for its troubles in this respect - but lessening its influence, or removing it entirely, is likely not something that Zuck and Co. would even entertain.
As such, the next best thing is education on digital literacy. It won't necessarily solve the platform's misinformation problems, but it may help instill a level of healthy skepticism in users, and better highlight what not to believe, and why.