Over the last couple of years, Facebook has come under intense scrutiny over how much data it collects on its users, and how it re-purposes that information, primarily for advertising. But what does Facebook actually know about you, and how could that information be used to influence your thinking and behavior? Is that even possible?
This is a core issue at the center of the current Facebook data debate - while more and more reports emerge demonstrating the depth of information Facebook can collect, the next stage is less clear, and users have little context for understanding the potential impacts. So what if Facebook logs data on the Pages you visit - that likely just means you'll see more relevant ads, right? So what if Facebook tracks your usage habits to show you more content you'll like?
In a general sense, for the average person, this doesn't mean anything, which is why Facebook usage hasn't appeared to change in any significant way despite such reports.
But it's that context that people really need - users need a greater understanding of the processes at play, and of how Facebook's systems can do more than just show you ads for products you might want to buy. On closer inspection, they can actually shape your opinions, without you even knowing it.
And that is critically important to note.
As has been well documented, Facebook uses a vast array of tools to track your interests, habits and preferences - both on Facebook's platforms and off them.
On the platforms themselves (Facebook, Instagram, WhatsApp, Messenger), Facebook logs every action you take, every post you like, every Page and profile you visit. Those insights are also combined with your profile data, your location info - every single thing you enter or do on Facebook's apps contributes a little more to your personal profile, building out a broader data log of who you are, what you like, etc.
And that profile can be very accurate - the most highly-cited research report on this was conducted by experts from the University of Cambridge and Stanford University back in 2015, in which they examined the Facebook profiles of more than 86,000 participants, and then matched their on-platform data against their psychological profiles, which those users had submitted through a 'personality test' app.
Their key finding? Your Facebook activity data alone could indicate your psychological make-up more accurately than your friends, your family - better even than your partner, given enough info.
As you can see in this chart, the accuracy of the model's predictions increases based on how many things a person has 'liked' on Facebook, giving the system more information to measure. That study was conducted around five years ago, so you can only imagine the same model would be even more accurate today.
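To illustrate the general idea, here's a minimal sketch of how a linear model over page 'likes' could predict a trait. The page names, labels and training data are all invented for illustration - the actual study used far larger datasets and more sophisticated statistical models.

```python
# Hypothetical sketch: predicting a personality trait from page "likes"
# using a simple linear scoring model. All data here is invented.

def train_like_weights(users):
    """Learn a weight per page: likes common among 'extrovert'-labeled
    users get positive weight, likes common among the rest get negative."""
    weights = {}
    for likes, label in users:
        for page in likes:
            weights[page] = weights.get(page, 0) + (1 if label == "extrovert" else -1)
    return weights

def predict(weights, likes):
    """Sum the learned weights over a user's likes; the sign of the
    total score gives the predicted trait."""
    score = sum(weights.get(page, 0) for page in likes)
    return "extrovert" if score > 0 else "introvert"

# Invented training examples: (set of liked pages, labeled trait)
training = [
    ({"skydiving", "parties", "travel"}, "extrovert"),
    ({"parties", "festivals"}, "extrovert"),
    ({"chess", "libraries", "poetry"}, "introvert"),
    ({"chess", "poetry"}, "introvert"),
]

w = train_like_weights(training)
print(predict(w, {"travel", "festivals"}))  # → extrovert
print(predict(w, {"libraries", "chess"}))   # → introvert
```

The key point the study's chart makes follows directly from this structure: every additional 'like' adds another weighted signal to the score, so the more likes the model sees, the more confident and accurate its prediction becomes.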
The implications of this are significant - as highlighted by Cambridge Analytica, which utilized a very similar process in its data-gathering efforts, once you have a measure of people's leanings, you can also use that against them.
Former Cambridge Analytica employee Christopher Wylie explained:
"We would know what kinds of messaging [users] would be susceptible to, including the framing of it, the topics, the content, the tone, whether it's scary or not, that kind of thing. So, what you would be susceptible to, and where you're going to consume that. And then how many times do we need to touch you with that in order to change how you think about something."
This is information warfare at the highest level - and when you also consider that around two-thirds of American adults (68%) now get at least some of their daily news content from social media, with Facebook the prime social media news source by a significant margin, that's a major concern.
It's not just the ads you see, but the information you're shown - and while Facebook is now working to address this in various ways, you can see how the additional context of what such insights can be used for is significant.
And it's not just activity on Facebook itself that's cause for concern.
Facebook also tracks people when they're not on Facebook - including those who've never even signed up. And it's worth noting that Facebook's not alone in this; Google, for example, tracks similar data. But Facebook's insights have seen more misuse, and the sheer size and usage of its platforms make it a more viable candidate for such actions.
In terms of tracking people off Facebook, researchers Frederike Kaltheuner and Christopher Weatherhead recently outlined the various ways in which Google and Facebook track Android users through the use of pixels which are designed to help advertisers gather data on app usage.
Advertisers may use these tracking tools in their apps in order to monitor who's using them, and that data is collected and stored, even if those people are not active users of Google's or Facebook's own tools.
"...the vast majority of apps share data the second they're opened, and the data that's being transmitted indicates what kinds of apps you use, when you use them, combined with a unique ad ID. And knowing what kinds of apps somebody uses, and when, can give quite a detailed picture of someone's life."
Kaltheuner provided an example using just four highly downloaded apps - 'Qibla Connect', which is a Muslim prayer app, 'Period Tracker Clue' which tracks menstruation cycles, job search app 'Indeed', and the kids app 'Talking Tom' (each of these apps has been downloaded at least 10 million times).
"That looks like a person who is likely Muslim, likely female, likely looking for a job and who likely has a child."
So even if you're not an active user of these specific platforms, they know a lot about you.
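The stitching process Kaltheuner describes can be sketched in a few lines - each app reports an 'opened' event tagged with the same device advertising ID, which lets a third party join the events into one behavioral profile. The ad ID value and the trait mappings below are invented for illustration; the app names follow the example above.

```python
# Hypothetical sketch of ad-ID joining: app-open events sharing one
# advertising ID are stitched into a single inferred profile.

# Invented mapping from app to the trait its usage suggests
APP_TRAITS = {
    "Qibla Connect": "likely Muslim",
    "Period Tracker Clue": "likely female",
    "Indeed": "likely job-seeking",
    "Talking Tom": "likely has a child",
}

def build_profile(events):
    """Group app-open events by advertising ID and collect inferred traits."""
    profiles = {}
    for ad_id, app in events:
        profiles.setdefault(ad_id, set()).add(APP_TRAITS.get(app, "unknown"))
    return profiles

# Invented event stream: (advertising ID, app opened)
events = [
    ("ad-1234", "Qibla Connect"),
    ("ad-1234", "Period Tracker Clue"),
    ("ad-1234", "Indeed"),
    ("ad-1234", "Talking Tom"),
]

print(build_profile(events)["ad-1234"])
```

Note that nothing in this process requires a Facebook or Google account - the advertising ID alone is enough to link the events together.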
Another recent study further underlines this capacity - a combined academic team from the University of Vermont and the University of Adelaide has found that even if you're not a user of a social media platform, it's possible to create a 95% accurate profile of you, based on your friends' accounts.
"[The research team] found it was able to predict the content of a person's tweets using data collected from just eight of their contacts, and to do so as accurately as if they were looking at that person's own Twitter feed."
From this, the team could accurately predict a person's political leanings, favorite products, religious beliefs - all without these users ever even participating in social media themselves. And this is without the advanced psychological profiling used by the groups noted earlier in this post.
Not only is social media data collection concerning, it's also seemingly inescapable. And it'll likely continue to be used for ill purposes for some time yet.
This needs to be a key issue of focus in the connected age, a core debate that has to be had. It may not seem immediately harmful, and it may not change much in your day-to-day life. But the understanding that your perception of the world - especially in a political sense - could be manipulated is surely a relevant concern.
And that element is largely beyond debate - while there's no way of knowing exactly what impact the work of Cambridge Analytica had on the eventual outcome of the 2016 US Presidential Election, previous research, conducted by Facebook itself, has shown that on-platform efforts can influence election outcomes.
According to a Facebook study of the 2010 US Congressional elections, a single election-day Facebook message resulted in around 340,000 extra voters turning out that year. The experiment used two different types of News Feed prompts - one included a link to information about local polling places and a counter showing the total number of Facebook users who'd voted. The other showed the same, but with the addition of images of the user's personal connections who'd already voted.
People who saw the second message were significantly more likely to vote themselves, resulting in a measurable increase in voter turnout. The scale of the test was limited, with only a small portion of Facebook users seeing the prompts. But it shows, according to Facebook's own research, that the platform can influence political outcomes.
It may not feel significant - what you like on Facebook, what you share, the content you see and comment on within your feed. But it is. That's why Facebook has sought to focus on showing you more content from friends and family, because that content is, theoretically, less manipulative than the material shared on potentially biased political Pages and the like.
Because now, given the coverage of recent political events, every bad actor knows that Facebook can be utilized for such purposes, and all of them are considering how they can do the same.
Will that make you use Facebook less? Probably not. Again, the stats show that Facebook usage has remained stable amid the various reports; people are either unconcerned, or Facebook is simply too ingrained in their daily interactions to give up.
But even if you think that you're aware, that your political beliefs are solid, that your understanding of certain issues is correct, just take a moment to question the logic behind what you see. Research a little, dig a bit deeper. Use Facebook's new tools, like its Page info data option, which shows where Pages are being managed from, the Page's history and the ads it's running.
How our data is used, and misused, may now be beyond our control, but we can all help limit misinformation by questioning what we see and reporting what's not right.
And given the stakes, such efforts may be crucial to maintaining democracy as we know it.