Truth. Who Needs It? part 1
from Ben Yagoda, "The Cognitive Biases Tricking Your Brain," The Atlantic, September 2018
Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”).

You probably recognize many of those cognitive biases Ben Yagoda and Wikipedia mentioned. And you probably recognize them because you’ve seen them in other people. They are harder to detect in ourselves. We humans are expert reasoners when it comes to spotting flaws in someone else’s argument. The positions we’re blind about are our own. For instance, when I mention to people that we’re good at seeing the weakness in other people’s arguments but lousy at noticing the gaps in our own reasoning, the most common response I get is: “Oh, yeah, I know a lot of people like that.”
Some of the 185 are dubious or trivial. The Ikea effect, for instance, is defined as “the tendency for people to place a disproportionately high value on objects that they partially assembled themselves.” And others closely resemble one another to the point of redundancy. But a solid group of 100 or so biases has been repeatedly shown to exist, and can make a hash of our lives.
The gambler’s fallacy makes us absolutely certain that, if a coin has landed heads up five times in a row, it’s more likely to land tails up the sixth time. In fact, the odds are still 50-50.
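(This isn’t from Yagoda’s article, but if you want to convince yourself the odds really are still 50-50, here is a minimal Python sketch – purely illustrative, with made-up variable names – that simulates a fair coin and checks how often the sixth flip comes up tails in runs where the first five flips were all heads.)

```python
import random

# Minimal simulation sketch: a fair coin has no memory, so after five
# heads in a row, tails on the sixth flip should still occur ~50% of the time.
random.seed(0)
trials = 1_000_000
streaks = 0       # runs where the first five flips were all heads
tails_after = 0   # of those runs, how often the sixth flip was tails

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):
        streaks += 1
        if not flips[5]:
            tails_after += 1

print(f"After five heads, tails came up {tails_after / streaks:.3f} of the time")
# Prints a value very close to 0.500 -- the previous flips change nothing.
```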
Optimism bias leads us to consistently underestimate the costs and the duration of basically every project we undertake.
Availability bias makes us think that, say, traveling by plane is more dangerous than traveling by car. (Images of plane crashes are more vivid and dramatic in our memory and imagination, and hence more available to our consciousness.)
The anchoring effect is our tendency to rely too heavily on the first piece of information offered, particularly if that information is presented in numeric form, when making decisions, estimates, or predictions. This is the reason negotiators start with a number that is deliberately too low or too high: They know that number will “anchor” the subsequent dealings. A striking illustration of anchoring is an experiment in which participants observed a roulette-style wheel that stopped on either 10 or 65, then were asked to guess what percentage of United Nations countries is African. The ones who saw the wheel stop on 10 guessed 25 percent, on average; the ones who saw the wheel stop on 65 guessed 45 percent. (The correct percentage at the time of the experiment was about 28 percent.)
The effects of biases do not play out just on an individual level. Last year, President Donald Trump decided to send more troops to Afghanistan, and thereby walked right into the sunk-cost fallacy. He said, “Our nation must seek an honorable and enduring outcome worthy of the tremendous sacrifices that have been made, especially the sacrifices of lives.” Sunk-cost thinking tells us to stick with a bad investment because of the money we have already lost on it; to finish an unappetizing restaurant meal because, after all, we’re paying for it; to prosecute an unwinnable war because of the investment of blood and treasure. In all cases, this way of thinking is rubbish.
. . . the endowment effect . . . leads us to place an irrationally high value on our possessions. In [one] experiment half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not.
If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view. Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything. Confirmation bias plays out in lots of other circumstances, sometimes with terrible consequences. To quote the 2005 report to the president on the lead-up to the Iraq War: “When confronted with evidence that indicated Iraq did not have [weapons of mass destruction], analysts tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it.”
Well, we are gathered for spiritual sustenance and spiritual challenge. Spiritually, what do we do with these facts about ourselves? There are two ways we might go from here.
I might talk about cultivating humility, developing a habit of doubting my own conclusions, holding my opinions lightly, and never believing what I think. I might talk about how to train and practice at spotting our own cognitive biases.
Or, I might take a different approach. I might say, you know what? Let’s just give up on that. It can’t happen. The biases built into our reasoning processes are inherent. They’re not fixable. One of the things we do spiritually is celebrate ourselves – affirm our worth and dignity, the beauty and wonder of the amazing life forms that we are. So let’s celebrate our cognitive biases because that’s who we are as humans. We are apes who search out and latch onto any information that seems to confirm what we already believe; we overlook or ignore information that suggests otherwise; and once we get a notion into our heads, it’s almost impossible to dislodge it. We rely on emotional reactions and heuristic shortcuts because: who’s got time to think for themselves and carefully analyze the data for accuracy and implications? It’s not that we’re lazy, it’s that we’re busy. We’ve got things to be doing. Let’s celebrate how amazingly productive we are!
The cognitive biases provide us with shortcuts, and yes, sometimes the shortcuts bypass, well, the truth – that is, the conclusion we would come to with a more careful and objective analysis of the evidence. But they are worth it. Our emotional reactions and our heuristic shortcuts help us connect to each other, form community, and facilitate our fantastic productivity. The negative effects of cognitive bias are usually negligible and only rarely disastrous, while the shortcuts themselves most often help us get along in our relationships and move through our tasks.
Anyway, brain studies indicate that we get a rush of dopamine when we are processing information that supports our beliefs. “It feels good to ‘stick to our guns,’ even if we are wrong.” (Jack and Sarah Gorman)
We see the speck in our neighbor’s eye but do not notice the log in our own eye because we were built to do that, and not one of us can help it. This is not a bug in the way our brains are wired. It’s a feature. Sure, there’s a shadow side to it, but it evolved for a reason. There are good reasons for having bad reasoning, so hooray for human cognitive biases!
Let's celebrate what we are! There isn't much we can do to mitigate the cognitive biases, but there is a little bit, and I'll talk about that later.
We humans are not merely a social species, but ultrasocial – a level achieved only by a handful of species, mostly insects like ants, termites, and bees. Chimps, for instance, are highly social – but they aren’t ultrasocial. Primatologist Michael Tomasello gave this illustration: “It is inconceivable that you would ever see two chimpanzees carrying a log together.” But you will see ultrasocial species – like ants or humans – carrying something together.
At some point in about the last million years, our ancestors developed shared intentionality – that is, the ability to share mental representations of a task so that multiple people can work on it. Take something as seemingly simple as one person pulling down a branch so that the other can pluck the fruit, after which both of them share the meal. Chimps never do this.
We are profound collaborators, connecting our brains together to solve problems that single brains can’t. We distribute the cognitive tasks. No individual knows everything it takes to build a cathedral or an aircraft. Our species’ success comes not from individual rationality but from our unparalleled ability to think in groups. Our great glory is how well we rely on each other’s expertise.
We rely on it so smoothly that we assume that we understand things ourselves that we have let others work out. Take zippers. Or toilets. Or cylinder locks – the sort of lock you probably have on your front door. Do you know how zippers, toilets, and cylinder locks work?
A study at Yale asked graduate students to rate how well they understood these everyday devices. Most of them rated their understanding pretty high. They were then asked to write detailed, step-by-step explanations of how the devices worked. Forced to spell out the details, they realized there were some key details they were pretty fuzzy on. Asked again to rate their understanding of these devices, they rated themselves lower. (Sloman and Fernbach)
This illusion of explanatory depth allows me to take for granted what other people know and frees me from having to, as we like to say, re-invent the wheel. Our vast and complex collaboration depends on “not having to think about it” – that is, not having to think about most things, so that my neurons can focus on what I am contributing, so that others don’t have to think about that.
* * *
This is part 1 of 3 of "Truth. Who Needs It."
See also
Part 2: When Truth Stopped Mattering
Part 3: If You Want Truth, Build Trust