On the Ezra Klein Show last year, Phil Tetlock (being interviewed by Julia Galef) described how good forecasters integrate multiple perspectives into their own:
JULIA GALEF: So we’ve kind of touched on a few things that made the superforecasters super, but if you had to kind of pick one or two things that really made the superforecasters what they were, what would they be?
PHIL TETLOCK: We’ve already talked about one of them, which is their skill at balancing conflicting arguments, their skill of perspective taking. However, although, but. They put the cognitive brakes on arguments before arguments develop too much momentum. So they’re naturally inclined to think that the truth is going to be some blurry integrative mix of the major arguments that are in the current intellectual environment, as opposed to the truth is going to be way, way out there. Now, of course, if the truth happens to be way, way out there, and we’re on the verge of existential catastrophe, I’m not going to count on them to pick it up.
JULIA GALEF: In addition to these dispositions and sort of general thinking patterns that the superforecasters had, are there any kind of concrete habits that they would always or often make use of when they were trying to make a forecast that other people could adopt?
PHIL TETLOCK: One of them is this tendency to be integratively complex and qualify your arguments, howevers and buts and all those, a sign that you recognize the legitimacy of competing perspectives. As an intellectual reflex, you’re inclined to do that. And that’s actually a challenge to Festinger and cognitive dissonance. They’re basically saying, look, these people have more tolerance for cognitive dissonance than Leon Festinger realized was possible.

(Emphasis mine.)
Cognitive dissonance is the state of having inconsistent beliefs. Tetlock is saying that good forecasters are more willing than most to have inconsistent beliefs. (In his book Superforecasting he uses the term “consistently inconsistent.”)
How could inconsistency be a good thing? Well, as he says, the integrative mindset tends to think “that the truth is going to be some blurry integrative mix of the major arguments.”
You could imagine two different ways of integrating seemingly disparate arguments or evidence. Say someone shows evidence that raising the minimum wage caused job losses in France, and someone else shows evidence that a higher minimum wage didn’t lead to any job losses in the U.S. (these are made-up examples). Say you think the evidence is high quality in both cases. How do you integrate those two views?
One way would be to try to think of reasons why they could both be true: What’s different about France and the U.S. such that the causal arrow might reverse between the two cases? That, I think, is a form of the integrative mindset. You’re trying to logically “integrate” two views into a consistent model of the world.
But the other integrative approach is basically to average the two pieces of evidence: to presume that on average the answer is in the middle, that maybe minimum wage hikes cause modest job losses. That is a “blurry integrative mix,” and it’s not super rigorous. But it often seems to work.
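The averaging approach can be made slightly more concrete. Here is a minimal Python sketch using made-up numbers, as in the minimum-wage example above; the inverse-variance weighting shown alongside the plain mean is one standard way of pooling estimates so that more precise evidence counts for more, not something Tetlock himself prescribes:

```python
# Two hypothetical effect estimates (say, percent change in employment
# after a minimum-wage hike), each with a made-up standard error.
france = {"effect": -2.0, "se": 0.5}  # hypothetical: job losses
us = {"effect": 0.0, "se": 0.5}       # hypothetical: no effect

# The simplest "blurry integrative mix": an unweighted average.
simple_mix = (france["effect"] + us["effect"]) / 2

# A slightly more careful mix: weight each estimate by its precision
# (inverse variance), as in fixed-effect meta-analysis.
w_fr = 1 / france["se"] ** 2
w_us = 1 / us["se"] ** 2
weighted_mix = (w_fr * france["effect"] + w_us * us["effect"]) / (w_fr + w_us)

print(simple_mix, weighted_mix)
```

With equal standard errors the two mixes coincide (both give -1.0, i.e. “modest job losses”); when one piece of evidence is more precise, the weighted mix pulls toward it.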
For the rest of the post I just want to quote a couple of other descriptions of integrative thinking…
How PolitiFact, the fact-checking organization, “triangulates the truth”:
PolitiFact items often feature analysis from experts or groups with opposing ideologies, a strategy described internally as “triangulating the truth.” “Seek multiple sources,” an editor told new fact-checkers during a training session. “If you can’t get an independent source on something, go to a conservative and go to a liberal and see where they overlap.” Such “triangulation” is not a matter of artificial balance, the editor argued: the point is to make a decisive ruling by forcing these experts to “focus on the facts.” As noted earlier, fact-checkers cannot claim expertise in the complex areas of public policy their work touches on. But they are confident in their ability to choose the right experts and to distill useful information from political arguments.
Roger Martin, in HBR in 2007, says great leaders are defined by their ability “to hold in their heads two opposing ideas at once.”
And then, without panicking or simply settling for one alternative or the other, they’re able to creatively resolve the tension between those two ideas by generating a new one that contains elements of each but is superior to both. This process of consideration and synthesis can be termed integrative thinking.