Facebook has released an update on its app data usage investigation, which was triggered by revelations that Facebook user data was used by political consultancy Cambridge Analytica to fuel its voter insights database.
The investigation was launched back in March, with Facebook vowing to examine all apps that had access to large amounts of user information before the platform changed its access policies in 2014 and limited what third parties could see. Before 2014, a wide range of apps could effectively obtain all user data permissions, giving outside companies the capacity to access people's personal information, and potentially use it in nefarious ways.
Thus far, Facebook says, it has investigated thousands of apps, with more than 400 providers suspended due to:
“concerns around the developers who built them or how the information people chose to share with the app may have been used”
That’s good and bad. The good is that Facebook has detected these potential breaches and removed the capacity for these providers to siphon Facebook user data in future. The bad is that it’s likely too late – if a company was going to utilize such information, it’s been able to do so for almost four years, and there’s no way for Facebook to know whether it’s on-sold that data, no matter how many ‘forensic audits’ it undertakes.
Interestingly, Facebook has also made a specific note about one app in its announcement:
“Today we banned myPersonality - an app that was mainly active prior to 2012 - from Facebook for failing to agree to our request to audit and because it’s clear that they shared information with researchers as well as companies with only limited protections in place. As a result we will notify the roughly 4 million people who chose to share their Facebook information with myPersonality that it may have been misused.”
Personality quizzes like this are the most common type of app that research companies have used to glean personal information. Back in 2015, researchers from The University of Cambridge and Stanford University used a similar personality quiz to gain insights from more than 86,000 Facebook users, which then fueled their report on how your Facebook activity can reveal your psychological leanings more accurately than friends, family and partners can.
The fact that so many users may have had their Facebook data accessed in this way is no doubt why Facebook has highlighted this one app in particular - but it’s also likely a move to reassure users that the company is acting on these types of tools, the ones most known to have been used in this capacity.
But it likely doesn’t matter. It does, of course, in terms of Facebook doing all it can to fix its systems - and Facebook’s new app policies will ensure that none of your information is shared with apps you haven’t used in 90 days.
But once that data is out, it’s out – and if a company had access to your information back then, it’s still likely relevant now. You may like different things on Facebook, different Pages, but any organization that’s looking to match your psychological leanings with Facebook data can probably already do so, even if Facebook cuts them off in future.
Basically, the actions here will protect future users from the same type of breach, but those who have already been affected will remain so, because such insights stay indicative even as they age.
That’s not to say there’s much more Facebook can do about it, but the retrospective app review itself is more an exercise in reassurance than actual, functional enforcement.