Facebook really wants – even needs – users to feel comfortable about the data they’re sharing on the platform, in order to quell distrust in the wake of the Cambridge Analytica scandal. As such, and using the latest GDPR changes as an impetus, starting this week, all Facebook users, in all regions, will be shown an alert when they visit the News Feed, prompting them to review details about the information they’re making available on The Social Network.
As explained by Facebook:
“Over the coming weeks, we’ll show people a customized message that puts the following information in front of them:
- How we use data from partners to show more relevant advertising
- Political, religious, and relationship information they’ve chosen to include on their profiles
- How we use face recognition, including for features that help protect your privacy
- Updates to our terms of service and data policy that we announced in April”
As noted, the messaging is a requirement of the new GDPR changes, but Facebook has opted to roll the tool out to all users as a means to rebuild trust. It’s a logical move from Facebook, and a good reminder that people are in control of their data. But, as we know from previous research, a significant number of users likely won’t bother to change anything, regardless of the broader concerns.
Part of the problem is that it’s hard to contextualize what it means to be sharing your data in this way. Do I care if Facebook is using facial recognition to detect other images of me posted to the platform? No, I don’t, because I don’t have anything to hide, and there’s some practical value in Facebook alerting me to pictures I might not have been tagged in. But that insight is then added to their database, which also tracks who I’m in photos with, where I am, what products appear in the same image, and what I’m doing. Even from this basic example, there are any number of ways that such insights can be used to build a better, contextual profile of who I am and what I’m about.
Now again, that doesn’t mean much – let’s say Facebook were to work out from images that I regularly attend local library events, that I drink Coca-Cola, and that I hang out with the same three friends.
That’s nothing, right? There’s not a lot Facebook can glean from that.
But that’s where you’re wrong. In isolation, these things don’t mean much, but when matched against a massive database – say, 2.2 billion other profiles – those little details can become highly indicative. People who attend library events and drink Coke will share a set of commonalities, maybe even political leanings, which Facebook can then delineate. Match that further against my on-platform activity – the Pages I’ve liked, the comments I’ve made, the people I’m in photos with and the things they Like – and pretty soon you’ll have a fairly narrow subset of users who fit the same profile. Those users will have similar psychological and political leanings, which Facebook can use to form accurate persona profiles, and to reach me with more effective ads.
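To make that narrowing concrete, here’s a minimal Python sketch of the underlying arithmetic. Everything in it is invented for illustration – the attribute names, the base rates, and the simulated 100,000-profile database stand in for the real thing – but it shows how intersecting a few individually weak signals quickly shrinks a large population to a tiny, similar cohort:

```python
# Illustrative sketch only: invented attributes and base rates,
# not real Facebook data or methods.
import random

random.seed(42)

# Assumed base rates for each signal (purely hypothetical).
ATTRIBUTES = {
    "attends_library_events": 0.05,
    "drinks_coke": 0.40,
    "likes_page_x": 0.10,
    "friends_with_group_y": 0.02,
}

# A tiny stand-in for a billions-strong profile database.
users = [
    {name: random.random() < rate for name, rate in ATTRIBUTES.items()}
    for _ in range(100_000)
]

def cohort(signals):
    """Return the users who match every signal in the list."""
    return [u for u in users if all(u[s] for s in signals)]

# Each added signal can only shrink the matching cohort further.
for i in range(1, len(ATTRIBUTES) + 1):
    signals = list(ATTRIBUTES)[:i]
    print(f"{i} signal(s): {len(cohort(signals)):>6} of {len(users)} users match")
```

Because each filter multiplies down the pool, four unremarkable traits are enough to isolate a handful of people out of a hundred thousand – and the same logic scales to billions of profiles.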
This is how politically motivated groups like Cambridge Analytica have reportedly used Facebook data. Given this, the issue is not so much awareness that you’re sharing such insights as how, exactly, those details can be used to build your personality profile.
It’s hard for Facebook to explain this, and in many ways, the best it can do is prompt users to review their settings. But without an understanding of what sharing such data actually means, there’s likely little motivation for most users to take the time to review them.
As noted, this has always been the way – back in 2012, reports showed that around 13 million Facebook users had never changed their privacy settings at all, while more recently, in the wake of the Cambridge Analytica reports, Facebook noted that it had seen no “wild changes in behavior with people saying I'm not going to share any data with Facebook anymore”.
Without the context of what it means to do so, people will likely leave things as they are – even in the wake of data-abuse reports, and even amid suggestions that their very thoughts are being manipulated based on psychological profiling.
Here’s an interesting note – back in 2015, I spoke to Dr. Michal Kosinski, a Stanford University professor who led one of the now-infamous psychological studies of Facebook users, conducted via a personality quiz app (important to note: this is not the study reportedly used by Cambridge Analytica).
In our interview, I asked Kosinski whether what he’d discovered about the depths of insight Facebook usage data can provide would make him think twice about using Facebook, or about sharing his personal information.
"Not at all, there are too many benefits that I'd deprive myself of […], you can't function in today's world without leaving behind significant amounts of digital footprint."
That is, indeed, the case – we’re sharing more and more of our personal data, in many forms, and while Facebook is logically taking the brunt of criticism, there are a lot of other ways your data can be tracked.
A sobering thought, and one which will require more than just changes to your Facebook settings.