Facebook's Been Running Psychological Experiments On You
Posted on July 3rd 2014
Felt a little lower than usual, or happier than usual for a week in January 2012? It might have been because of an "experiment" Facebook conducted on you.
Facebook identified 689,003 English-speaking users to run a psychological experiment on for the duration of a week. It manipulated the news feed of one group of these users to remove posts with a negative emotion attached to them, and removed posts with a positive emotion for the other group. The objective of the study - can we be emotionally influenced by what we see in our Facebook news feed?
And if so, how much?
The experiment has caused a huge amount of uproar on... uhh, Facebook. There's post after post of people feeling that they're being treated like guinea pigs and that manipulating someone's mood is incredibly dangerous. These are all legit concerns. Someone battling depression, or worse, on the brink of suicide, being exposed to increasingly "negative" posts on Facebook for an entire week might be persuaded to do something that they normally wouldn't.
Is what Facebook did ethical? Probably not.
Is what Facebook did legal? Absolutely.
Facebook's terms of service, which every user agrees to, gives them a lot of wiggle-room in terms of what they can or cannot do with our data, and the kind of information and updates that we see on Facebook.
In fact - this is true for most (if not all) social networks. LinkedIn, Twitter, Google+, Tumblr, Pinterest - all of these websites are designed and engineered to influence us to click more, engage more and interact more with them. The nature of their algorithms is never revealed, but one thing is always made clear - they're doing all they can to give us as much relevant content as possible.
This practice, however, of showing one group one piece of content and another group a different piece, isn't new or ground-breaking. Marketers call this practice of split testing content A/B testing: show one group of people one piece of content and analyze what emotion it elicits, show another group a different piece of content, and compare the emotions or actions each one produces.
It's primarily used to drive higher conversions to sales, higher engagement, more sign-ups and so on and so forth.
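As a rough illustration of what that looks like in practice, here's a minimal sketch of an A/B test: users are deterministically bucketed into variants, and the two conversion rates are compared with a two-proportion z-test. All names and numbers here are hypothetical, and real testing tools handle far more (sample sizing, multiple metrics, sequential peeking):

```python
import hashlib
import math

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a variant by hashing their ID,
    so the same user always sees the same version of the content."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return variants[digest % len(variants)]

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: how far apart are the conversion rates,
    measured in standard errors? |z| > 1.96 is roughly 95% confidence."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical example: headline B converted 260/2000 visitors vs. A's 200/2000.
z = two_proportion_z(200, 2000, 260, 2000)
print(f"z = {z:.2f}")  # well above 1.96, so B's lift looks significant
```

The hashing trick matters more than it looks: random assignment on every page load would show users inconsistent content and muddy the emotional signal being measured, which is exactly the signal Facebook's experiment was after.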
Stepping outside the realm of social networks, websites such as Amazon run a huge amount of A/B tests to figure out what influences users to buy more. To extrapolate that - you could say that they're testing out what we emotionally respond to, in order to get us to buy more. Other websites run A/B tests all the time to figure out what landing page will trigger more conversions, what graphic will get more clicks and shares, and what line of text will influence engagement.
Websites such as BuzzFeed and Upworthy A/B test their headlines to drive more click-throughs. Those headlines are designed to draw out emotions from readers that prompt them to share the article and generate more "buzz" for BuzzFeed. That, in itself, is a manipulation of your emotions, isn't it?
The truth of the matter is that websites A/B test all the time. For marketers this is common knowledge, because in a digital world it's incredibly easy to conduct such a test with any of the many tools available to them.
The only difference is that Facebook actually came out and talked about this A/B test. Most other companies don't, and never have. What a company has learnt about its users through A/B testing tends to be closely guarded information - why would they want to let competitors gain insight from experiments that they've run?
You're not paying to use Facebook or Twitter. When you're not paying for the product - you are the product. Your information, your memories, your experiences, that's what you pay with each time you use a social network.
All that set aside, however, this does raise significant questions. If, by manipulating the content you see in your news feed, Facebook is able to influence your thinking - and, by extrapolation, your actions - what becomes possible by manipulating the news feeds of users at large?
Could a political candidate who's backed by a network like Facebook essentially win more votes? Ask any dictator, any ruler, what one wish they'd like granted - and they'd undoubtedly wish for the power to influence the mood and emotions of their people.
Has Facebook, by experimenting on a group of close to 700,000 users, essentially proved that, if push comes to shove, it can sway the opinion of the 1.3+ billion people that use the service?
Emotional engineering is, and always has been, Facebook’s business model.
Two thoughts are playing in my mind about this entire debacle. The first, why would Facebook go public with such information? Why would they openly talk about such an obviously unethical experiment that proves that they're able to influence the mood of a large group of people? Why would they want this known?
And the second, which is far more unsettling, is that, all said and done, we aren't truly surprised that Facebook did this. We've come to expect them to blur the line between right and wrong, and for us to just shrug, shake our heads, and carry on.