Behavioral economics in one chart

It’s sometimes claimed, not entirely unreasonably, that the research on cognitive biases amounts to an unwieldy laundry list. Just look at how long the list of cognitive biases is on Wikipedia. This frustration is usually paired with some other criticism of the field: that results don’t replicate, or that there’s no underlying theory, or that it’s a mistake to benchmark the human brain against the so-called “rational ideal.”

I’m not very moved by these critiques; if the list of biases is long, it’s partly because the psychology of heuristics and biases is appropriately a very empirical field. An overarching theory would be nice, but only if it can explain the facts. The whole point of behavioral economics was to correct the fact that economists had let theory wander away from reality.

Still, I’ve been thinking recently about how to sum up the key findings of behavioral science. What’s the shortest possible summary of what we know about bias and decision making?

Enter Decision Leadership by Don Moore and Max Bazerman. They wrote the textbook on decision making, and in this book they offer advice on how to make good decisions across an organization. I’ve had the pleasure of working with them, and I recommend the book.

What I want to share here isn’t their advice, but their succinct summary of decision biases, from the book’s appendix. It’s the best synthesis of the field I know of:

[Chart: Moore and Bazerman’s summary of decision biases, from p. 196 of Decision Leadership]

Here’s a bit more.

The human mind, for all its miraculous powers, is not perfect. We do impressively well navigating a complex world but nevertheless fall short of the rational ideal. This is no surprise–perfect rationality assumes we have infinite cognitive processing capacity and complete preferences. Lacking these, we adapt by using shortcuts or simplifying heuristics to manage challenges that threaten to exceed our cognitive limitations. These heuristics serve us well much of the time, but they can lead to predictable errors. We group these errors into four types based on the heuristics that give rise to them.

p. 195

The availability heuristic

The first is the availability heuristic, which serves as an efficient way of dealing with our lack of omniscience. Since no one knows everything, we rely instead on the information that is available to us… Reliance on the availability heuristic gives recent and memorable events outsize influence in your likelihood judgments. After experiencing a dramatic event such as a burglary, a wildfire, a hurricane, or an earthquake, your interest in purchasing insurance to protect yourself is likely to go up… The availability heuristic leads us to exaggerate vivid, dramatic, or memorable risks–those we can easily retrieve from memory… The availability heuristic biases all of us toward overreliance on what we know or the data we have on hand. Sometimes information is easier to recall because it is emotionally vivid, but other information is privileged simply due to the quirks of human memory… You can reduce your vulnerability to the availability bias by asking yourself what information you would like to have in order to make a fully informed decision–and then go seek it out.

pp. 195–198

The confirmation heuristic

The second is the confirmation heuristic, which simplifies the process by which we gather new information… One of the challenges that impedes us in seeking out the most useful evidence to inform our decisions is that confirmation is so much more natural than disconfirmation… Even our own brains are better at confirmation than disconfirmation. We are, as a rule, better at identifying the presence than the absence of something. Identifying who is missing from a group is more difficult than determining who is present. That has the dangerous consequence of making it easier to find evidence for whatever we’re looking for… Confirmation can bias our thought processes even when we are motivated to be accurate. It is even more powerful when it serves a motivation to believe that we are good or virtuous or right. Virtue and sanctimony can align when we defend our group and its belief systems. The motivation to believe that our friends, leaders, and teachers are right can make it difficult to hear evidence that questions them… The automatic tendency to think first of information that confirms your expectations will make it easy for you to jump to conclusions. It will make it easy for you to become overconfident, too sure that the evidence supports your beliefs… If you want to make better decisions, remind yourself to ask what your critics or opponents would say about the same issue.

pp. 195, 199–202

The representativeness heuristic

The third is the representativeness heuristic, which stands in for full understanding of cause and effect relationships. We make assumptions about what causes what, relying on the similarity between effects and their putative causes…

p. 195

This is a tricky one so I want to step outside the book for a second and supplement the definition above with the definition from the American Psychological Association:

representativeness heuristic: a strategy for making categorical judgments about a given person or target based on how closely the exemplar matches the typical or average member of the category. For example, given a choice of the two categories poet and accountant, people are likely to assign a person in unconventional clothes reading a poetry book to the former category; however, the much greater frequency of accountants in the population means that such a person is more likely to be an accountant.
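The APA’s example is, at bottom, a point about Bayes’ rule: similarity to the category (the likelihood) gets all the weight, while the relative frequency of the categories (the prior) gets neglected. Here’s a minimal sketch of that arithmetic in Python; the specific numbers are made-up assumptions for illustration, not figures from the book or the APA.

```python
def posterior(prior_a, prior_b, like_a, like_b):
    """P(category A | evidence) via Bayes' rule for two candidate categories."""
    joint_a = prior_a * like_a
    joint_b = prior_b * like_b
    return joint_a / (joint_a + joint_b)

# Assumed base rates: accountants far outnumber poets in the population.
prior_poet, prior_accountant = 0.001, 0.05

# Assumed likelihoods of "unconventional clothes, reading a poetry book"
# given each category -- the evidence strongly resembles a poet.
like_poet, like_accountant = 0.5, 0.02

p_poet = posterior(prior_poet, prior_accountant, like_poet, like_accountant)
print(f"P(poet | evidence) = {p_poet:.2f}")
```

Even with evidence that fits the poet stereotype 25 times better, the posterior probability of “poet” under these numbers is only about one third: the base rates dominate. Representativeness is what happens when intuition runs on the likelihoods alone.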

The framing heuristic

Finally, framing helps us establish preferences, deciding how good or bad something is by comparing it to alternatives… Context frames our preferences in important ways. Frames drive our choices in ways that rational theory would not predict. We routinely behave as if we are risk averse when we consider choices about gains but flip to being risk seeking when we think about the same prospect as a loss. This reversal of risk preferences owes itself to the fact that we think about gains and losses relative to a reference point–usually the status quo or, in the case of investments, the purchase price… One [consequence] is the so-called endowment effect, our attachment to the stuff we happen to have… The endowment effect can contribute to the status quo bias, which leads us to be irrationally attached to the existing endowment of possessions, privileges, and practices.

pp. 195, 205–209
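The risk-preference reversal described in that passage falls out of the standard prospect-theory value function. The book doesn’t present this formula; this is a sketch using Kahneman and Tversky’s commonly cited parameter estimates, so treat it as illustrative rather than as the authors’ model.

```python
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of outcome x relative to the reference point (0).

    Concave for gains, convex and steeper for losses -- the steepness
    ratio lam captures loss aversion. Parameter values are Kahneman
    and Tversky's (1992) commonly cited estimates.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Risk averse over gains: a sure $50 beats a 50/50 shot at $100.
sure_gain, gamble_gain = value(50), 0.5 * value(100)
print(sure_gain > gamble_gain)   # True

# Risk seeking over losses: the 50/50 gamble beats a sure -$50.
sure_loss, gamble_loss = value(-50), 0.5 * value(-100)
print(gamble_loss > sure_loss)   # True
```

Both reversals come from the same curve: because value is concave above the reference point and convex below it, the sure thing wins for gains and the gamble wins for losses, exactly the flip the quote describes.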

Summing it up

So, what’s the shortest version of behavioral economics and decision making? We are not perfectly rational, and in particular we’re intuitively quite bad at statistical and probabilistic thinking. We rely heavily on the information we can easily recall, and we look for reasons to confirm what we already think–especially when doing so protects our self-conception or our group’s status. When we think about cause and effect, we don’t reason carefully about probability and counterfactuals; instead, we think in terms of stories and archetypes, putting things into categories and constructing causal narratives from the category. And we are creatures of context: our frame of reference can shift based on what is made salient to us, and we are often especially attached to the status quo.

There are a lot of narrower biases within these four heuristics, and no doubt there’s plenty to quibble with in any specific taxonomy like this one. But in my book that’s a pretty decent starting point for summing up a wide set of empirical work, and it clearly helps explain a lot of how we think and how we decide.
