the science of confirmation bias

June 9, 2009

In one of Terry Pratchett’s sarcastic allegories, The Truth, the ruler of a vast and powerful city-state, wondering why in the world anyone would need to start a newspaper, astutely notes that people don’t want to hear something really new that might upset their worldview. Instead of “news,” he says, people really want “olds”: measured and official confirmations of their preconceived notions. And according to some recent research, it seems Lord Vetinari was on to something. Not only do people tend to spend more time looking for and reading what supports their current views, they also try to rationalize away well-known parodies of their ideologies.

[image: partisan faceoff]

According to an Ohio State University study, people spend as much as 36% more time reading essays that agree with their opinions. This statistic was obtained by monitoring the controlled surfing habits of 156 students who identified their positions on hot-button culture-war issues and were then presented with articles obviously matching their views about 58% of the time. Interestingly, those who seemed most certain of their party affiliation and positions clicked on more links to opposing essays than the average participant, and these highly confident students included more conservatives than liberals. Why? The study’s authors want to run a follow-up experiment to pin down those readers’ motivations. My casual guess is that they may be combing through counter-arguments to keep an eye on the opposition, or to stoke the fires of their partisan fury.

Satire also gets filtered through the ideological prism, as another OSU research project indicates. In this case, a sample of 332 self-identified liberals and conservatives was shown clips from The Colbert Report, a show intended to parody right-wing pundits, as confirmed by Colbert himself. Even so, the conservatives in the study tended to say that despite using over-the-top humor, Colbert actually means what he says about liberals and their positions. The authors write that the show’s deadpan style leaves plenty of room for semantic ambiguity and biased information processing. Conservatives hear something they want to hear and process the comments not as satire but as humorous agreement.

While these studies focus on political biases in reading choices, I think we can extend some of their findings to how people engage with those articles. Readers who are absolutely certain of the correctness of their ideology, and who have a habit of reading things into posts they find disagreeable, are a constant issue for bloggers. When we write something that becomes controversial, we get hit with a string of ridicule that accuses us of virtually anything and everything, insisting we made statements that don’t appear anywhere on the blog and trying to expose a sinister intent behind those non-existent words. But dealing with such cases is just part of the blogger’s job. If anything, the occasional debate livens things up. And according to the science, such debates are virtually inevitable.

See: Knobloch-Westerwick, S., & Meng, J. (2009). Looking the Other Way: Selective Exposure to Attitude-Consistent Political Information. Communication Research, 36(3), 426-448. DOI: 10.1177/0093650209333030

LaMarre, H., et al. (2009). The Irony of Satire: Political Ideology and the Motivation to See What You Want to See. The International Journal of Press/Politics, 14(2), 212-231. DOI: 10.1177/1940161208330904

  • This post has been linked for the HOT5 Daily 6/10/2009, at The Unreligious Right

  • Yes, but how do we separate the sheep from the goats? It’s kind of depressing to be a skeptic 24 hrs a day.

  • Goofus Bird

    Louis Farrakhan and his NATION OF ISLAM are the biggest bunch of radical extremists and he should just go and take a hike

  • Mark

    “My casual guess is that they may be looking through counter-arguments to keep an eye on the opposition or stoke the fires of their partisan fury.” – This is way too casual for my liking. I’d look at it the other way around: do people who are uncertain of their beliefs (but want to believe) avoid material that would challenge those beliefs?
    That would mean that those that have strong beliefs are less likely to be subject to confirmation bias than “normal”, which is what the study seems to suggest on the face of it.