Chris Mooney had a great piece at Mother Jones recently that has been making the rounds. The title is “The Science of Why We Don’t Believe in Science” and it’s a good primer on some of the literature on how we rationalize to protect our biases and more generally our worldview. If you haven’t read it yet I highly recommend it. Here’s the gist:
Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”
In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
This is related to the concept of “sacred beliefs” that I’ve been harping on lately. The general point I want to make is that the problem Mooney sketches out can be viewed as a challenge that media can help to overcome. What if the media you’re consuming knew from your history and profile that you had certain biases, and therefore presented information in a way that made it easier for you to overcome those biases?
There are clearly a number of challenges here. Determining bias is tricky in the first place, because it has to be done in reference to some “truth”, which, given the nature of the problem, is likely controversial. And even once that is done there would need to be a way to measure progress in overcoming biases. But we take for granted that digital media offers the opportunity to design experiences that are customized and interactive in a way newspapers and other “old media” are not. Why not focus on cognitive personalization aimed at helping us think more rationally?
At what point would a “cognitive personalization” system’s results be more valuable to someone than the comfort of their “sacred beliefs”? Is there a reason “sacred beliefs” are so sacred? Don’t they serve a personal/societal value?
It seems like there’s an initial step required: admitting you have a rationalization problem with your media. Perhaps with a cognitive personalization system, following a 12-step rational media consumption program (aka Sacred Beliefs Anonymous), one would be able to fire the lawyer within.
Yeah, good points Jon. Sacred beliefs are totally valid, but not when they seep into the realm of empirical beliefs, like we discussed today. You’re welcome to believe that the government shouldn’t be interfering in people’s lives. Say that’s your sacred value. The problem is when that value means you won’t accept the science of climate change, because on some level you fear that accepting it would imply a challenge to that sacred value.
I love Sacred Beliefs Anonymous! (SBA) and the idea of “firing” the lawyer inside!
But I think one thing Mooney explains well is that this stuff is ingrained in how the brain works. So it’s not something any of us will overcome once and for all, but rather a constant struggle to do better.
And that’s the context in which I think we can start designing systems and embedding them in our media. Just as scientists use peer review under the assumption that the system will eventually produce more accurate results, we ought to systematize the way we take in controversial information.