Have you seen and heard enough about Trump vs Clinton by now? It's a critically important contest, no doubt, one that all Americans, and indeed all citizens of the world, need to have at least some awareness of. But we're going on 16 months since Trump announced his candidacy, and the campaign is a marathon that can feel just as exhausting to follow.
If you're feeling political fatigue, you're not alone - a new study conducted by Pew Research has found that 37% of social media users report being worn out by the campaign, compared to only 20% who say they like seeing lots of political information. Pew's data, based on responses from 4,500 US adults, also found that 59% of people describe their online interactions with those they disagree with politically as "stressful and frustrating," while 64% say those discussions leave them feeling "as if they have less in common than they thought."
The numbers reflect two significant elements of the modern political process, and the part social media plays within it.
The first is the overwhelming flood of information we have in our modern, connected world. The famous stat commonly thrown around is that 90% of the world's data has been created in the last two years, with much of that information coming from social media. That influx of content can be overwhelming at times, which may be what's led to so many people feeling worn out by the constant bombardment of political content in this election cycle.
But the other, more important, element reflected in Pew's data is the growing role social media is playing not only in how we receive information, but in which messages actually get through. And that may have far more significant implications for our political decision-making process.
The US Presidential campaign has been one of the more divisive and hotly debated political races in some time. On one hand, you have Hillary Clinton, who represents the political establishment as it stands, and has stood for some time. On the other, you have Donald Trump, who represents a shake-up, a challenge to the status quo for those dissatisfied with the way the Washington machine has been running things. "Drain the swamp" is one of Trump's favorite descriptions of the problem, a reference to flipping the current political system on its head.
The two approaches, in a basic sense, come down to a question of whether you want to keep things the way they are or you want change. Keeping things the way they are, with Clinton, implies a level of safety and comfort, whereas Trump, from an objective standpoint, represents more risk (a simplification, admittedly, as there are many other perspectives to consider, but broadly true). And while many struggle to understand why people would be willing to take such a risk, it's worth noting the power of social media in the modern political process, and how more revolutionary movements - like Trump, like Brexit - have been able to gain more traction than in times past because of the structure of the modern news cycle and the inputs through which we receive relevant information.
For example - in another section of the Pew study, researchers found that 83% of Americans try to avoid posts from friends they disagree with politically, with 39% saying that they've blocked or unfriended someone in order to avoid opposing political views.
That's a concern, right? In a modern, democratic society, with more access to information and insight than ever before, we should be evolving our way of thinking, becoming more open to others' views. Right? The data shows that this is simply not the case.
Trends like this are exactly what raise concerns about the "echo chamber" effect of social media networks - the worry that social platforms, with their personalized algorithms and user-defined news feeds, leave people with a narrower view of the world, despite having access to a much broader scope of information. The ideal, philosophical outcome of a more connected society is that we become exposed to more perspectives, more opinions - we develop better understanding because we're more connected through these new communication options. But there are many examples - especially in the political sphere - of the opposite being true.
By building our own media inputs, based on our indicated preferences and selections, we may actually be creating a system that simply reinforces our own biases, as opposed to providing us with additional perspective.
From a consumption standpoint, that makes sense - if we're served more content we're interested in, we're more likely to spend more of our time on that platform, which obviously benefits the platforms themselves. But when considering the benefits of shared perspective, of greater understanding of situations outside of our own, that process may actually have a negative impact, and inadvertently lead to greater societal division.
The potential problems with the echo chamber effect gained more focus recently with Facebook's 'Trending News' controversy - essentially, Facebook was accused of tampering with their Trending News listings in order to prioritize, or de-emphasize, news content as they saw fit.
Such actions can have significant consequences - you might think that such efforts would have little impact on you personally, but the fact of the matter is that more Americans than ever are now getting their news via Facebook.
If a story were trending on Facebook that wasn't actually trending - say, one tied to a political movement - that could lead to the movement becoming a much bigger issue, and a much more divisive debate, than it otherwise would have been.
This is what Facebook's been accused of with the #BlackLivesMatter movement - an important debate, no doubt, an issue worthy of genuine discussion and action. But Facebook's trending news team has been accused of manually inserting the topic into the trending news feed in order to artificially boost its focus. The move, on Facebook's part, may have been well intentioned - this is an important debate, and as such, it's important to expose the Facebook audience to it. But such intervention can significantly alter the course of the discussion. Maybe it widens awareness of the crucial issues by exposing more people to them. Or maybe, it actually creates more of a divide.
In this instance, let's say that more people become aware of the topic because Facebook has manually pushed it out to more users. That, in turn, leads to more people posting their opinions about it, more people putting forward their perspective and essentially taking sides, stating what they believe one way or another - which is really what social media is designed to facilitate. That's great for Facebook - more people posting more often means more people spending more time on the platform - but because Facebook's algorithm aims to show people more content similar to what they like, maybe people on each side of the debate end up exposed to an increasing amount of content that simply reinforces their opinion, as opposed to furthering the discussion. Maybe those people unfollow opposing voices, block those who disagree with them. Maybe Facebook, while intending to generate more rational debate and understanding, has actually reinforced the dividing lines of the controversy and pushed more people to take definitive sides.
This is a hypothetical, of course - how that actual process played out is impossible to know without analyzing the full gamut of Facebook posts about a specific issue. But you can see how Facebook's tailored, algorithm-defined News Feed, which is built to highlight more content relevant to each user's individual interests, can actually reinforce confirmation bias in practice.
Ever since the Trending News controversy, Facebook's gone to great lengths to explain how they've changed their systems so that such tampering can never happen again (if it ever did), but even if you remove the manual tampering from the process, the potential for distortion and amplification remains.
Of course, you could argue that such bias has always been present in some form - you listen to a certain radio station or read a certain newspaper and you're subject to their editorial bias, implicit or explicit. But Facebook's algorithm-defined process takes this to another level. Now, it's not just the media outlets you select - that curation can extend to your personal interactions. Whereas you might once have been exposed to alternate perspectives by, say, going to dinner with a friend, now, through Facebook, you'll likely know their political leanings ahead of time. And you can mute them, shut them off. You can even stop talking to someone in real life because you don't agree with what they think. Pew's numbers suggest that may be exactly what's happening.
And now Instagram uses a similar algorithm-defined process. So does Twitter (to a lesser degree). So does LinkedIn. Even Pinterest is taking an active role in getting people to be more politically aware.
Content more tailored to your interests, as noted, is great for engagement - but is it beneficial to the process of fueling political awareness and debate?
And should social networks care about, or feel in any way responsible for such impacts, either way?
Cause and Effect
It's a challenging issue, and one on which there's still much research and debate to be had. But the fact of the matter is that social media can, absolutely, impact how we vote, and the decisions we make about politicians and political movements.
Back in 2010, around 340,000 extra voters turned out to take part in the US Congressional elections because of a single election-day Facebook message. Facebook, in the past, has also proven that it can manipulate the emotional state of users by showing or restricting content types in their feeds.
Social media, however you want to look at it, is one of the most influential media sources in the world today - far more than a time-waster or a trend among teen users. This is why there's such interest in the debate over whether Facebook is a media company or not, and why there's such concern about issues like Facebook's Trending News controversy. Because the implications are significant, likely more significant than we even realize.
The decisions we make shape the world we live in. Arguably, there's no investigation more important than that which helps us understand the dissemination of ideas and how our thoughts are influenced.