how can we deal with an echo chamber web?
We all know someone who only listens to what she wants to hear: the devoted Fox News or MSNBC viewer, the dedicated Red State or Daily Kos reader, the periodic fire-breathing comment section dragon armed with the latest string of talking points and partisan accusations. Nowadays, we’re not restricted to the same few highly regulated news channels and papers. We have thousands of channels, hundreds of major blogs, and entire ecosystems of news sites. You would think that if anything, we’d be exposed to something different virtually all the time and have far more diverse viewing and reading lists, right? After all, this was the thinking behind the repeal of the Fairness Doctrine, an FCC-mandated measure to give equal time to opposing views, intended to curb propagandizing on behalf of political candidates back when there were just five or six channels on everyone’s TV and people got their news from a dozen papers. But as it turns out, we use the very technologies meant to expose us to more ideas to throttle the torrent of content down to what we find palatable.
While one could certainly argue that there’s a great diversity of views across the web, pointing to everything from sites run by a myriad of corporations offering mainstream news reports to forums curated by those who believe we are all unwitting subjects of sinister Satanic aliens and demons, we can also make the argument that as the web grew and the initial torrent of content online swelled into a tsunami, it began to be corralled into cozy, uniform echo chambers connected to each other through a shared ideology. A big part of the reason why is that we’re dealing with too much data to process. Do you really read every Facebook update in your feed? Can you really look through all 1,100+ sources Google News gives you for a top story? At some point you have to narrow the information coming at you into a manageable stream, and that’s when political biases begin to play a very significant role. Software doesn’t really care whether you’re an open-minded moderate or a partisan zealot; its only concern is to make sure it brings you exactly what you want and nothing else. Likewise, Google and every other company offering to filter the web for you care little about how diverse your worldview is, and simply want to offer ads specifically customized to appeal to people with your exact preferences.
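To make that filtering concrete, here’s a minimal sketch of the kind of engagement-driven ranking such software could use. Everything in it, from the scoring rule to the names, is a hypothetical illustration rather than any real company’s code, but it captures the mechanic: a story on a topic the reader never clicked scores zero and simply falls out of the feed.

```python
# Hypothetical sketch of engagement-driven feed filtering, not any
# real company's code: score each story by the reader's past clicks,
# then keep only the top few.
from dataclasses import dataclass


@dataclass
class Story:
    title: str
    topics: set[str]


def relevance(story: Story, clicks_by_topic: dict[str, int]) -> int:
    # A story is "relevant" in proportion to how often the reader
    # already clicked on its topics.
    return sum(clicks_by_topic.get(topic, 0) for topic in story.topics)


def filter_feed(stories: list[Story],
                clicks_by_topic: dict[str, int],
                limit: int = 10) -> list[Story]:
    # Rank by the reader's own habits and truncate: anything the
    # reader never engaged with scores zero and silently disappears.
    ranked = sorted(stories,
                    key=lambda s: relevance(s, clicks_by_topic),
                    reverse=True)
    return ranked[:limit]
```

Nothing in that loop is malicious; it’s just optimizing for clicks. The echo chamber is a side effect of the truncation.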
So here’s the question: should these companies start caring? According to one paper, offering a few preference-inconsistent articles in a custom-filtered news stream does prompt some people to read something new, which helps them form a more nuanced and well-rounded idea of the subject matter, exactly what one would expect after someone has read multiple viewpoints on the same issue. That doesn’t mean the subjects changed their opinions, just that new points were considered and factored into their thought process. Considering the furious, foaming-at-the-mouth partisan rants across far too many sites nowadays, that alone sounds like a big step toward a more civil public discourse with less emphasis on partisan loyalty and dogmatism. However, this paper’s data set was gathered from 140 college students, and the topic in question was an abstract one, with most subjects indicating they knew very little about it. Had the topic been something that hits partisan frictions rather than the transhumanist-sounding “neuro-enhancement,” the results would have been more applicable to the context where these preference-inconsistent recommendations would matter most. We don’t know whether we’d really be able to get the same students to read an article opposing their ideological stance and show at least recognition of its points, if not an outright appreciation and discussion of the opposing arguments.
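For what it’s worth, the intervention itself is simple enough to sketch: reserve a few slots in the recommendation list for items that clash with the reader’s inferred stance instead of filtering them all out. The stance labels, slot count, and two-item quota below are my assumptions for illustration, not the study’s actual procedure.

```python
# Sketch of a preference-inconsistent recommendation mix; the stance
# labels, slot count, and quota are illustrative assumptions, not the
# study's actual setup.
import random


def recommend(ranked: list[dict], reader_stance: str,
              slots: int = 10, inconsistent: int = 2) -> list[dict]:
    agreeing = [s for s in ranked if s["stance"] == reader_stance]
    opposing = [s for s in ranked if s["stance"] != reader_stance]
    # Fill most slots the usual way, then reserve a couple for the
    # best-ranked opposing items instead of dropping them entirely.
    feed = agreeing[:slots - inconsistent] + opposing[:inconsistent]
    random.shuffle(feed)  # don't bury the dissenting picks at the bottom
    return feed
```

The design question the paper can’t answer is what happens in the last line of the pipeline: whether anyone actually clicks those two opposing items when the topic is a politically charged one.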
So it may all come down to whether you can find people willing to get out of their comfort zone every once in a while, and how much their identification with a certain movement means to them. If they prize conformity and believe that a new idea is a threat rather than an opportunity to see what others think, well-meaning inconsistencies in their filtered search results and news feeds will be treated as a nuisance and ignored. If they’re fine with a periodic exploration of divergent opinions, they’ll be willing to click on ideologically inconsistent matches every once in a while. Again, the goal here would be just to make sure that other opinions are not filtered out of view and the web doesn’t turn into a search engine and social media enabled collection of echo chambers where ideological dissent is met with punitive action. But it seems much more likely that we can’t do it by being sneaky with technology. That willingness has to come from the person first and foremost, and that could turn out to be either the easiest thing to change, or the hardest. Giving the open-minded a new option is more than enough, but when dealing with the most closed-minded, filter-happy denizens of the web, our only recourse will be to incentivize reading ideologically opposing content. And how does one incentivize open-mindedness?
See: Schwind, C., et al. (2012). Preference-inconsistent recommendations: an effective approach for reducing confirmation bias and stimulating divergent thinking? Computers & Education, 58(2), 787–796. DOI: 10.1016…