No, everybody isn't in an "echo chamber" on social media
That doesn't mean it isn't really bad
Thanks for reading America Explained! Paid subscriptions are what keeps this newsletter a going concern, so please upgrade if you’re able to spare a few dollars or euros or whatever (I’m not picky!) a month to support independent journalism and to access all of our posts. And as always, students and educators can get a full subscription for free - just drop me a line.
Recently I’ve been occupied with writing an undergraduate U.S. politics textbook. It mainly involves trying to present well-known material in an engaging way. But it also lets me delve into areas of research I’m not familiar with, and sometimes I find out really interesting things.
This happened while I was writing a section on social media and politics. I went into it taking for granted the idea that social media has split everyone into echo chambers where they constantly have their own views reinforced. But what I discovered was a decade of research telling a more nuanced story.
Researchers in this field distinguish between echo chambers and the more strangely named “filter bubbles”. An echo chamber is what happens when we pick our sources of information according to our tastes. A filter bubble, by contrast, is what happens when the algorithm makes the choice for us. For instance, YouTube might know I’m interested in animal rights and then start showing me more and more slaughterhouse footage.
Because of their potential to make political divisions worse, political scientists have spent a lot of time investigating these phenomena. What they’ve found is that for most people, they don’t exist.
That doesn’t mean that social media doesn’t have dangerous effects on political discourse. But it might suggest that we’ve been focused on the wrong things.
Let’s start with echo chambers and filter bubbles. Research shows that only a small number of people inhabit an echo chamber on social media - less than 10% of the U.S. population. And even that figure exaggerates the problem, because we live in an intensely multi-platform world and almost nobody only gets political information from social media.
Someone might have a politically monochrome Twitter feed but still be friends on Facebook with family members who share opposing views. They will still catch snatches of local TV or radio. It’s just very hard to construct a real echo chamber today.
Filter bubbles turn out to be similarly hard to find. The original panic around filter bubbles was focused on things like search engines and news websites - mainstream tools used by large numbers of people. But these sites - like Google News for example - actually do remarkably little algorithmic shaping of results. In fact, if anything, the problem with Google News is that it’s too unresponsive to user tastes, tending to just serve up the same big mainstream news sources all of the time when it could be directing us to smaller independent sources (like this newsletter!).
Similarly, on social media, filter bubbles are not as strong as public discourse tends to assume. Most people don’t just relentlessly click on the same type of political content, and people on both sides of the political aisle frequently share content from the opposing side. To be sure, they generally do this just to mock it - but to the algorithm, a share is a share. This behavior means that few people end up algorithmically filtered into bubbles.
That’s the good news. Now for the bad news.
The first bit of bad news is that although echo chambers and filter bubbles on social media might not exist for the majority of the population, they are a reality for some. And the people for whom they are a reality tend to be people who are most engaged in politics.
These are the people who are most likely to vote in a primary, write to their member of Congress, or share a political post on social media. They set the tone of discourse, especially when you consider that many members of the political elite - politicians and their staff - are precisely the sort of people who inhabit these types of bubbles.
The second bit of bad news is that none of this means that social media is not bad for politics. It just means that the specific idea that it’s bad because it puts everyone into chambers and bubbles doesn’t really stand up to scrutiny. But what is still true is that social media has an enormous tendency to turn political discourse negative because of the types of posts that it encourages.
Research has shown that social media posts which are emotionally charged and carry a negative framing of some out-group do much better than dryly factual or positive contributions. And this type of post is common precisely because we do not live in echo chambers - we are routinely exposed to the ideas of “the other side” on social media and we often react with horror, anger, and disgust.
The third bit of bad news is more fundamental. The promise of the internet was always that by allowing us to get to know one another better, the opportunity for mutual understanding and cooperation would increase. But what if the real problem isn’t that social media separates us, but rather that it exposes us to one another more consistently and often, and that the result is a constant barrage of negativity and misunderstanding based on our differences?
Algorithmic tweaking might help with that problem somewhat, but its roots lie much deeper - in political and social divisions that predate the internet and which we can’t blame on social media alone.
In this post, I drew on research which you can find summarized and explained well in Are Filter Bubbles Real? by Axel Bruns and Echo Chambers, Filter Bubbles, and Polarisation: A Literature Review by Amy Ross Arguedas, Craig T. Robertson, Richard Fletcher, and Rasmus K. Nielsen.

