With all the discussion around how social platforms can fuel divisions within society - by facilitating the spread of fake news, and by reinforcing filter bubbles as algorithms show users more of what they already agree with - the latest Pew Research study on social media news consumption comes at an important time, and underlines, yet again, why this is a significant area of concern.
According to Pew's latest data, which incorporates responses from more than 4,500 US adults, 68% of Americans say that they now get at least some of their news from social media, making it an influential source of updates on the latest issues and affairs.

It is worth noting that the percentage of people getting news from social platforms is largely unchanged from Pew's 2017 report (67%). But that's still a significant amount of influence - news and issues shared on social platforms are reaching large audiences, and are able to impact public opinion and, importantly, voter behavior.
That's little surprise considering all the recent coverage we've seen of Russian political groups seeking to influence American voters via social platforms, and this data underlines why such efforts can work. Add to this the fact that users are more likely to comment on negative, divisive news stories - another signal which prompts algorithms to increase a post's distribution - and you can see how social platforms may well be helping to fuel societal division. And there may be no easy answer as to how to stop it.

But while users continue to turn to social platforms as a source of news content, Pew's study also shows that they're becoming more skeptical of the information presented - 57% of social media news consumers now also believe that the news they see on social media is 'largely inaccurate'.
That then raises the question - why are people getting news from social media when they believe it's not true? Is that a problem in itself?
According to Pew's research:
"...most social media news consumers say getting news this way has made little difference in their understanding of current events, and more say it has helped than confused them (36% compared with 15%)."
So the information may be inaccurate, but it's still helping to provide more of an understanding of the key issues. It may be a leap, but I'd read that as 'the news I see reinforces my established beliefs, so it's helpful, even if not 100% correct'.
If this is indicative, that's a concerning trend. The insights here suggest that people know the reports they're reading on social are not right, but they're still using them to form opinions anyway. Maybe part of this is driven by the 'kernel of truth' theory - that even if a report isn't 100% right, there must be some element of truth in it, and that's enough to help guide opinion. Maybe. It's difficult to say exactly what this finding means.
In terms of which platforms users turn to for news content, Facebook still leads by a significant margin - more than double its closest competitor (YouTube).

That's little surprise, given The Social Network has more than 2.2 billion active users, but it's also the platform most heavily influenced by algorithmic sorting. The problem with that is that the algorithm shows you more of what you interact with, from the users you interact with, and based on the subjects you like. And that can lead to increased polarization.
You can see this in The Wall Street Journal's widely cited 'Blue Feed, Red Feed' experiment, which shows how Facebook's algorithm feeds supporters on either side of US politics more and more content that aligns with their existing views.

Looking at that experiment, it's easy to see how supporters on either side are dragged further towards their respective perspectives by the algorithm's selections.
Essentially, the algorithm works well for generic topics - if you like dogs on Instagram, your Explore feed will be filled with more dog pictures. That's helpful, as it aligns with your interests, but it's potentially damaging when it comes to political content. If you're shown more content from one side of politics, and less - if any - from the other, you become increasingly locked into a siloed set of views.
Why would you assume that there's anything else if it's all you're seeing? And even if you know that not everything you read on social is 100% correct, if the algorithm keeps showing you more and more content from users and Pages which reinforce a single view, you can't help but become more aligned with that perspective.
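To make that feedback loop a little more concrete, here's a minimal, purely illustrative sketch (in Python) of how engagement-weighted ranking can narrow a feed over time. The two 'viewpoints', the weights and the update rule are all assumptions made for the sake of the toy model - this is not Facebook's actual ranking system - but it shows how a small initial preference, fed back into the ranking, can come to dominate what a user sees.

```python
import random

# Toy model of an engagement-driven feed (illustrative only - this is not
# Facebook's actual ranking system). Posts belong to one of two hypothetical
# "viewpoints"; the feed scores each candidate post by the user's past
# engagement with that viewpoint, and every click nudges the weights further
# in that direction.

random.seed(42)

VIEWPOINTS = ["red", "blue"]


def rank_feed(candidates, engagement_weights):
    # Score each post by the user's learned preference for its viewpoint,
    # plus some noise so the feed isn't purely winner-take-all.
    scored = [(engagement_weights[post] + random.uniform(0.0, 0.4), post)
              for post in candidates]
    scored.sort(reverse=True)
    return [post for _, post in scored]


def simulate(rounds=20, feed_size=10, starting_bias=0.55):
    # The user starts with only a mild preference for "red" content.
    weights = {"red": starting_bias, "blue": 1.0 - starting_bias}
    for round_num in range(1, rounds + 1):
        # A fresh, evenly split pool of candidate posts each round.
        candidates = [random.choice(VIEWPOINTS) for _ in range(100)]
        feed = rank_feed(candidates, weights)[:feed_size]
        # The user engages with posts roughly in proportion to how much
        # they already favour that viewpoint; each engagement reinforces it.
        for post in feed:
            if random.random() < weights[post]:
                weights[post] += 0.02
        # Normalise so the two weights remain comparable across rounds.
        total = sum(weights.values())
        weights = {k: v / total for k, v in weights.items()}
        red_share = sum(1 for p in feed if p == "red") / feed_size
        print(f"round {round_num:2d}: red share of feed {red_share:.0%}, "
              f"weights red={weights['red']:.2f} blue={weights['blue']:.2f}")


if __name__ == "__main__":
    simulate()
```

Run it and the 'red' share of the simulated feed tends to climb round after round, even though the underlying pool of candidate posts stays evenly split - which is, in simplified form, the dynamic the 'Blue Feed, Red Feed' experiment visualizes.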
In terms of platform specifics, as you would expect, Facebook is a bigger source for older internet users, while Snapchat and Instagram are key for younger audiences.

There's a heap more in the full Pew Research report, reinforcing the complexity of our relationship with, and reliance on, social media, and how that can be problematic when it comes to news consumption. There are various angles to consider, and no easy answers, but the data here shows that social media remains a key news source, even amid concerns about accuracy (note: the research was conducted in July and August this year).
That may be the biggest concern of the entire report.