The biggest social media story of the moment, and maybe ever, is the latest controversy surrounding the misuse of Facebook data by advertising/lobbying group Cambridge Analytica.
To quickly get you up to speed: former Cambridge Analytica staffer Christopher Wylie has come forward with allegations that the company used personal information on American voters, gleaned from Facebook by academic researchers, to hyper-target Facebook ads and content designed to play on people's psychological leanings and influence the outcome of the 2016 US Presidential Election. The same strategies were reportedly used to shift perception in the UK's Brexit vote.
In many ways, the revelations are not all that surprising. Ever since the election of Donald Trump, people have been trying to work out how, exactly, it happened. But the direct link between Facebook data and the mass manipulation of society has raised a whole new set of crucial questions. Political leaders are calling for Facebook to come under more scrutiny, officials are saying the company could be subject to hefty fines, and the public, more broadly, is no doubt reconsidering the amount of trust it places in the network.
Will that lead to fewer people using Facebook? Will we see Facebook come under more regulatory restrictions? And what, exactly, is the extent of what's possible with Facebook's data?
Here’s an overview of the implications.
Data Risks
First off, the most pressing question: can your Facebook data be used to influence your opinions without you even realizing it?
The answer, undoubtedly, is yes.
Back in 2015, I interviewed Dr. Michal Kosinski, who was part of a research team that had conducted an experiment very similar to the process reportedly used by Cambridge Analytica.
In their study, Kosinski and his team took the results of a hundred-question psychological survey, completed by more than 86,000 participants through a Facebook-linked app, and mapped the responses alongside each user's respective Facebook Likes, which were also accessible through the app's permissions.
Based on these correlating data points, the research team was able to work out a range of psychological traits from commonalities in the data, with their results proving more predictive than assessments made by the subjects' family, friends and even partners.

Using these insights, the team was able to establish baselines for a range of complex queries – here’s an example from Kosinski:
"One of our most surprising findings was that we could even predict whether your parents were divorced or not, based on your Facebook likes. Actually, when I saw those results, I started doubting my methods and I re-ran the analyses a few more times. I couldn't believe that what you like on Facebook could be affected by your parents' divorce, which could have happened many years earlier - we're talking here about people who might be 30 or 40 years old."
The researchers could accurately predict a wide range of attributes based on your digital footprint, including whether you were a smoker, your drinking habits, whether you used drugs, and your sexual orientation. According to Kosinski, everything was predictable, to a degree, based on this data.
So how did they do it? The secret is in scale.
Let's say a person likes both Coca-Cola and the NFL on Facebook, either by directly liking the relevant Page or by interacting with related content in some way. On its own, that doesn't tell you anything. One person liking two things is not a trend; you can't conclude that everyone who likes Coke will also be a fan of the NFL. But at a scale of literally billions of users, and trillions of correlating data points, clear patterns do start to take shape. People who like this thing will also like that, with 80% accuracy. People who are smokers share these traits, which are also shared by this other group who report being non-smokers (but likely are).
People who are more likely to vote for this candidate are most concerned about these issues – you can influence how they vote by targeting them with this type of content.
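To make that a little more concrete, here's a minimal, entirely hypothetical Python sketch of the general idea: given a matrix of which users like which Pages, plus a known trait for some of those users, even a simple classifier can learn the correlations and predict that trait for everyone else. The Page names, the "smoker" label and the data are all made up for illustration; this is not Kosinski's actual pipeline, which worked on a far larger, real dataset.

```python
# Illustrative sketch only: a toy version of like-based trait prediction.
# All data below is randomly generated; the correlations are planted
# to mimic the kind of patterns that emerge at scale.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

pages = ["Coca-Cola", "NFL", "Yoga", "Harley-Davidson", "Poetry"]

# Rows = users, columns = Pages; 1 means the user liked that Page.
likes = rng.integers(0, 2, size=(1000, len(pages)))

# A made-up binary trait (say, "smoker") that correlates with a couple
# of the like columns, plus noise.
signal = 1.5 * likes[:, 1] - 1.0 * likes[:, 2] + rng.normal(0, 1, 1000)
trait = (signal > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.2, random_state=0
)

# A plain linear classifier is enough to pick up the planted correlations.
model = LogisticRegression().fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))

# The learned weights show which likes push the prediction up or down.
for page, weight in zip(pages, model.coef_[0]):
    print(f"{page}: {weight:+.2f}")
```

The point of the toy example is only this: no single Like is revealing, but once you have enough users and enough Likes, the correlations become strong enough to predict things people never disclosed.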
Your basic interactions, your Likes, photos and shares, may not seem like they mean much on their own. But they do, or they can, so long as someone has access to the larger dataset.
This is what Cambridge Analytica reportedly had access to, via an academic who gave them the data. And even though those insights were recorded back in 2014, they'd still be relevant and indicative today; that data could still be used to influence voters in 2020 with a large degree of accuracy.
Facebook itself has built its business on the back of the effectiveness of its ad targeting, offering ever more sophisticated, data-driven options. There's no doubt that such processes can be used to manipulate the public.
And now that members of the public – not to mention regulatory authorities – are becoming more aware of this, what does that mean for Zuck and Co. moving forward?
A History of Queries
This, of course, is not the first time Facebook’s in-depth ad targeting has come under scrutiny.
Facebook has faced a range of questions over its data tools and platform policies, and has even come under regulation, in varying capacities, across regions of Europe. The company has access to more personal insight than any other organization in history, which is why it's so incredibly valuable to advertisers, but that capacity also comes with an equally high level of risk, as evidenced by the Cambridge Analytica case.
Thus far, Facebook has been able to shake off such concerns and continue its forward momentum, but the suggestion that its data can be used to influence elections, potentially putting citizens at massive risk, is something that no one can ignore. Sure, you might be able to shrug off specific ad targeting; so what if those Nikes you just looked up are now being pitched to you in a sidebar ad? But knowing that your very thoughts and unconscious leanings are being used against you is something else entirely. Based on the data, you could well be making significant decisions without even realizing it.
Any platform of Facebook's size and influence is undoubtedly going to run into such issues at some point. But that's the thing: there is no other platform of Facebook's influence, and there never has been. Given this, The Social Network is in uncharted territory. That's not to excuse its various mishandlings of such concerns, but no company has ever had to handle them before. No one's ever been in the situation Facebook now finds itself in.
On the flip side, no regulatory authority has ever faced such a challenge either. There's simply no way of knowing, for sure, just how much influence Facebook has, or what the company, or anyone who can access its data, can do with it. It's clearly powerful; the level of insight could clearly impact people's actions. But could it really be used to alter your thinking on a critical subject? Can political groups really use this data to change the way we think on a large enough scale to shift culture?
If the latest accusations are correct, then it can - though even then, the actual impacts are impossible to quantify. In which case, should we let Facebook off the hook?
The answers to these questions will play a crucial role in future considerations for The Social Network, and how it can be used by everyday advertisers and academics alike (I've asked Facebook whether the investigation will lead to further regulation of data on their part, but have received no response).
But then the other key question is ‘are we better off, as a society, without Facebook?’
Will this be the beginning of the end for Zuckerberg’s social behemoth?
Switching Off
On this front, the report couldn’t have come at a worse time for the company.
Various reports have already indicated that Facebook usage is in decline for the first time ever, with the platform turning to drastic new measures to try to boost engagement and keep users on site.

These new concerns will give users further reason to reassess their Facebook habits. If people are already spending more time in private messaging channels to avoid public over-sharing, the latest revelations will only further fuel that movement.
Of course, Facebook the company is largely not losing out, as engagement on its other platforms (Messenger, WhatsApp and Instagram) continues to increase in step. But the main Facebook platform is still where the company makes the most money, where its ad system is best optimized, and where the most potential still lies.
Facebook can still harvest data from those channels too, but without intrusive ads it can't monetize that data in the same way, which, from a perceptual standpoint, makes those apps seem less risky.
It could well be that this latest controversy does the Facebook brand permanent damage, and leads to a re-focus on those other apps moving forward – and that could fundamentally change the social media landscape.
But then again, if history shows us anything, it's that humans are habitual; in all likelihood, people won't change their social media habits unless they're forced to. If Facebook comes under further scrutiny and regulation, that too could trigger a wide-scale industry shift, and that seems the more likely scenario, as opposed to users leaving in droves.
Even so, it does feel like public perception is shifting. Where that could have a particularly significant impact is among younger users. While most adults now have a social routine in place, younger users coming through do not, and they might not adopt Facebook so readily given the controversy. That could see Facebook slowly start to tail off, the first signs of a decline.
It seems incredibly premature to be forecasting the death of a platform that generated some $40 billion in revenue in 2017. But it does feel different this time. The questions are too loud to ignore.
As noted, Facebook may well be able to navigate the minefield and see things go back to normal, but the perceptual shift feels significant.
The company will be forced to explain itself, beyond its announced investigation into data misuse by Cambridge Analytica, and the answers it provides will be crucial.
This new controversy is going to hurt, no doubt, but how much pain Zuck and Co. feel will depend on how well they respond to the rising roar of concern.