As it continues to refine its approach to misinformation and disinformation, and to detecting networks seeking to use its massive audience reach for their own political purposes, Facebook has today published a new, 44-page report on the various coordinated networks that it's detected since 2017, in the wake of revelations around Cambridge Analytica and Russian-backed interference efforts in the US.
Facebook's full 'Threat Report on Influence Operations' outlines the scope of its efforts on this front, and the scale of the actions that it's taken to stamp out ill-intentioned activity by various groups.
And there are some interesting insights - for example, Facebook says that:
"...from 2017 through mid-2021, we have taken down and publicly reported on over 150 covert influence operations that violated our policy against Coordinated Inauthentic Behavior (“CIB”). They originated from over 50 countries worldwide and targeted both foreign and domestic public debate."

The full overview provides more context on the exact nature and focus of these campaigns - including the audience focus of each, and how that's evolved over time.

As you can see here, such operations are increasingly domestically focused, as groups look to use Facebook and Instagram to manipulate politics within their own countries. In 2017, by contrast, these were all foreign-facing campaigns - which may suggest that broader awareness of how Facebook can be used for such purposes has since been adapted into smaller-scale, domestic pushes, while larger foreign influence campaigns, which are potentially easier to detect (based on IP, location tracking, etc.), have declined year-over-year.
In terms of which nations were the biggest culprits, according to Facebook's findings, it may or may not surprise you that Russia has been the key originator of such efforts.

Russia's now-infamous IRA - or 'Internet Research Agency' - has been linked to many, many online influence operations, on Facebook and across other platforms, and in many ways, it's led the way in highlighting how social platforms can be used for such purposes.
Of course, we don't know how effective any of those efforts have been (some research suggests they had little impact), but the IRA, which is backed by the Kremlin, has clearly been working to test its capacity in this respect, and learn what can potentially be achieved through targeted social media push campaigns, aligned with specific political objectives.
Given this, it'll also come as little surprise to see which country has been the most common target of these efforts.

The domestic chart, however, is interesting. Facebook has been a key source of angst in smaller regions, like Myanmar, where it's been linked to major political misinformation and influence efforts. A key concern in some of these regions, which are still in the midst of the first wave of their digital transformation, is that Facebook can quickly become the source of truth, and can subsequently end up with an outsized influence on community beliefs and behaviors, simply by virtue of its size and usage.
Facebook has been working to get ahead of this, in various ways, and will need to continue to invest in digital literacy education as it moves into new regions, like parts of Africa and other remote areas that are only now getting access to the internet.
That's also why various nations have raised concerns about Facebook becoming available to these audiences, having seen the divisive impact it can have in other regions. Facebook's Free Basics internet access program, for example, was opposed by several governments - many of which aren't able to facilitate internet connectivity in any other way - simply because of concerns around the potential dangers of Facebook access, and the power it could give The Social Network as a result.
But then again, some regions are also opposed to broader web access due to their own information restriction processes, which form a key part of their power structures. Yet even so, it is interesting to note the domestic activities highlighted here, and to consider how more domestic groups are looking to influence local politics via Facebook's platforms.
Which is also becoming more common - as Facebook notes:
"We anticipate seeing more local actors worldwide attempt to use IO tactics to influence public debate in their own countries, further blurring the lines between authentic public debate and deception. In turn, technology platforms, traditional media and civil society will be faced with more challenging policy and enforcement choices."
Facebook also notes that as its detection programs have improved, many groups have shifted to "narrower “retail” campaigns that use fewer assets and focus on narrowly targeted audiences", further pointing to increased use in domestic politics.
Facebook also says that these groups are expanding their focus to a broader set of networks as another means to avoid detection.
"By running operations on multiple platforms, threat actors are likely trying to ensure that their efforts survive enforcement by any given platform. They’ve also targeted hyper-local platforms (e.g. local blogs and newspapers), to reach specific audiences and to target public-facing spaces with less-resourced security systems."
It's an interesting overview of Facebook's evolving detection and enforcement efforts, and the similarly evolving tactics being employed by such groups around the world.
And those tactics will indeed keep evolving. Facebook now controls the largest interconnected network of humans in history, and through that, more groups will continue to try to find new opportunities to influence Facebook and Instagram users for their own gain.
I mean, that's what Facebook marketing is essentially all about, right? So on one hand, Facebook, through its own ad tools and options, actually facilitates such activity for certain purposes, while on the other, it stamps that activity out when it's used for dishonest and politically manipulative means.
It's an important focus, because we've seen the divides that Facebook can amplify first-hand, in the case of riots and civil unrest sparked by online discussion.
As such, it remains critical for Facebook to keep evolving its approaches, and countering such efforts where possible.
You can read Facebook's full 'State of Influence Operations 2017-2020' report here.