
why just showing more data doesn’t slow propaganda on social media

The idea that just giving people more data will lead them to make the right choices flies in the face of how humans typically work.
[image: head buried in sand]

When the internet became available to the masses, the researchers whose grand ambition it was to get everyone online had fantastic, utopian visions of the future. Buy anything you want online, educate and better yourself with all of humanity’s knowledge at your fingertips, meet and befriend new people and expand your horizons. Connectivity was supposed to wipe out the things that divide us, make us more informed, and build friendships on global scales. Unfortunately, while we got the online shopping part down in short order, today’s internet is awash in racist memes, trolls, and bullshit we thought was supposed to be on its way out. Just look at social media during this election and try not to shudder in disgust. What happened to that noble vision of the web as the great unifier and educator?

Partially, people happened. If we’re going to get real about this, let’s just put it out there: a lot of denizens of social media have about as much interest in bettering themselves or studying an opinion different from theirs as most of us would have in showering with a rabid raccoon. But surely, idealistic designers and skeptics say, if we just put an instant fact check next to the claims on social media, we could flag things that aren’t true and slow the stream of partisan propaganda calculated to feed on people’s biases, right? One recent version of this argument was a proposal to redesign Facebook to do exactly that for its users, accusing the company of abdicating a duty to at least flag the lies, hoaxes, and spin that spread like wildfire through its platform. Unfortunately, that won’t help at all, and we know that because scientists have studied the subject and found that technology isn’t the problem; people are.

Now, it’s a valid point that social media exacerbates the problem by giving a lie greater reach and enabling echo chambers, and any attempt to fight this is good. At the same time, those proposing built-in fact checks, like the ones Google is now trying to deploy, are missing the fact that when people are presented with data that challenges their beliefs, they’re actually more likely to reject it. The end result of trying to debate any issue with facts alone is that the adherents whose worldview you’re challenging stick to their guns even harder, quite literally in certain debates. The reason is that people don’t like dealing with the cognitive dissonance that changing one’s views often carries, so they simply refuse to change their views. Many people even find it physically stressful to lose a debate about a deeply held belief and will cling to it no matter what to cope with that stress. If you’ve ever heard the phrase “you’re not going to talk me out of this,” you’ve seen this in action.

Things are even more dire with conspiracy theorists because they will refuse to let their favorite conspiracies go no matter what, usually by invoking one of the many sub-conspiracies they claim is meant to distract from the main conspiracy, or by crying about a coverup. And remember that social media companies don’t want to be in the business of vetting news. Facebook tried, and it didn’t end well at all. Imagine it now trying to actually fact check, and showing today’s hardcore Republicans that their favorite pages basically lie to them almost a third of the time. It would make the trending topics skirmish look like a day at the beach by comparison because, again, the people who follow all these questionable news sources don’t care about fact checks. They’re deliberately choosing to believe something that matches their worldview, and no redesign is going to stop them. It would actually make them even more obstinate.

The only way people ever change their minds is when they either don’t have a choice because everything they saw shattered their long-held beliefs, or are curious and don’t have much emotional investment in the matter at hand. It really is tempting to look at a problem exacerbated by technology and come up with a technological solution, but that solution won’t work when you’re dealing with a human problem rather than a programmatic one. In fact, the notion that the majority of social media users are objective and rely on hard data, so that presenting that data to them instantly will keep them from falling for misleading partisan spin designed to play to their biases and deeply held opinions, seems like another deeply held personal belief itself…

# tech // cognitive dissonance / psychology / social media

