Facebook Is Coming to Town: Ho-Ho...No!

Mike Johansson
Lecturer at Rochester Institute of Technology and principal at Fixitology

Posted on December 19, 2013


They know when you've been sleeping, 
They know when you're awake. 
They know when you've been good or bad, 
So be good for goodness sake! 

Is Facebook being revealed as a grinch during the holiday season?

Facebook is, at the height of the holiday season, being revealed as at least as big a snoop as the Jolly Old Elf, or even the NSA (National Security Agency).

In a story on Slate a few days ago it was revealed that Facebook is … analyzing thoughts that we have intentionally chosen not to share.

That’s right: When you start to write something on Facebook but change your mind and delete it, that material does not simply disappear. No, Facebook has been scooping it up and analyzing it to study what two FB researchers call "self-censorship."

But what’s to stop Facebook from using all this data for other reasons? For example, to serve us even more highly targeted advertising? That would be a fairly benign result.

As the Slate story points out, some people might compare this to the FBI’s ability to turn on a computer’s webcam without the user’s knowledge to monitor for criminal activity. The difference is that the FBI must obtain a warrant for that kind of surveillance. In Facebook’s case, no warrant is needed.

The Facebook researchers say that decreasing self-censorship is a goal of the social network, because such censorship reduces the quantity of content (and thereby the quantity of researchable data) shared publicly on the platform.

But the bigger question this brings up is: Can Facebook be trusted? History would tend to suggest it cannot.

Under Facebook’s Data Use Policy, there is a section called "Information we receive and how it is used." It makes clear that the company collects information you choose to share, as well as data generated when you view or otherwise interact with things. But nothing in it suggests that the company collects content you explicitly decline to share.

So what do we, as users, do about this? The likely answer: nothing, except perhaps think twice before typing anything into Facebook.

Several studies have indicated that any concern about trust may be limited to older users of Facebook.

Data collected by MDG Advertising from the American Consumer Institute Center for Citizen Research, Anonymizer, Harris Interactive, MSNBC and The Ponemon Institute shows that overall "2 out of 3 active online users do not trust" the social media sites they are using. These numbers are based on users of all ages.

Click on the graphic to see the full report and infographic.

Do we trust online sites we use?

On the other hand, a 2012 survey conducted by YouGov in Britain (whose findings echo earlier surveys in the United States and elsewhere) found that younger users of online services, such as social media sites, are more likely to trust those services.

Click on the graphic to see the full report and infographic.

Online trust changes with age


All of which suggests that these latest revelations will leave older Facebook users more concerned about their privacy, while making very little difference to younger users.

What do you think? Should a social media platform be completely transparent about what information it is looking at and how that information is being used?

Mike Johansson

Lecturer at Rochester Institute of Technology and principal at Fixitology

Mike is a strategist and teacher who helps businesses and students understand and get the most from social media. He currently is a Lecturer in the Department of Communication at the Rochester Institute of Technology where he teaches advertising, public relations and journalism (all with a social media twist). 


Comments

This is fascinating, albeit very unsettling. Why do you suppose that Facebook thinks that self-censorship "decreases the quantity of content"? What purpose does snooping on our self-censored posts and comments have for delivering higher quality content? The only thing I see it benefiting is Facebook's research, which is interesting in an academic sense but completely unethical (and illegal?) since it clearly oversteps users' privacy. Self-censorship has been a part of communication since humans were able to communicate, so while the idea of studying self-censorship is interesting, I don't see how it could help Facebook deliver higher quality content. The lack of explanation by Facebook is extremely unnerving.

Thanks for the comment, Betsey. My intention in writing this was mostly to make more people aware that this has happened and could happen in the future. As to the "why" ... from FB's perspective, the more data points they have on each of us, the more targeted they can be with advertising and the more accurate in their predictions of the likely relevance of messages and other content they serve up to us.

Weird. The post cited for this article says "The Facebook rep I spoke with agreed that the company isn’t collecting the text of self-censored posts." So what is your post about again? This is the real current story in social media: the old-fashioned game of telephone. Someone posts something, it is barely read and understood, and then regurgitated across the 'net, sometimes even by news organizations.