Moral Psychology, part 3
Our moral intuitions and emotions are vital -- we can barely function without them. They, more than our rationality, provide the basic sense of right and wrong we draw on at dozens, if not hundreds, of decision points every day. Important as they are, they are limited and sometimes wrong. One problem is that these feelings are highly resistant to reason. Thought experiments like the trolley problem show how hard a feeling of an action's wrongness is to shake, even in a hypothetical scenario where you are repeatedly reminded that, by stipulation, the counter-intuitive action would produce the better results.
While reason won't sway moral intuitions much, nonrational factors do so with worrisome ease. In Rob Drummond's show “The Majority,”
“The votes are interspersed with Drummond’s narrative, . . . about how he got involved with the anti-fascism movement and ended up being arrested for punching a white supremacist.” (Sophie Gilbert, Atlantic)

Later, Drummond shares his “disgust with himself for, as he puts it, ‘punching a man for having an opinion.’” He asks again whether it’s okay to abuse someone for something they personally believe, and this time 87.6 percent say no. He has, essentially, converted the audience. And the ease with which he’s done it is, muses Sophie Gilbert, “yet another unnerving element to bolster his argument — that few of us really know or deeply consider what we’re voting for.”

Our moral decisions -- sometimes even life-or-death decisions -- aren’t very carefully considered. Studies have revealed:
- People shown a comedy clip, and then asked whether they’d push the large man to his death, were more likely to approve killing the large man than those in another group that was shown a “tedious documentary about a Spanish village.” (Sarah Bakewell, New York Times)
- We are more generous toward a stranger if we have just found a dime.
- A judge’s decision to grant parole depends on how long it has been since he or she had lunch.
- Subjects asked to make judgments about controversial moral questions -- for example, marriage between first cousins or the making of a documentary in which people were tricked into being interviewed -- make harsher moral judgments if they are standing next to a smelly trash can than if they are not. The brain more easily finds behavior morally disgusting if the disgust reaction has already been given a little jump-start.
If moral assessment is so easily manipulated, that means voting behavior is too. What candidates actually say in the course of a campaign has little impact on voters. Most voters just take in the vibe and let their unconscious biases pull them toward trust or distrust. Completely irrelevant factors can shift how a voter feels about the world, and a good feeling about the world benefits the incumbent.
- In areas with a strong college football fan base, it was found that if their team won its most recent game before the November election, the incumbent candidate got a big boost.
In the end, reason is our only hope. It’s slow and plodding, and it’s pretty useless for day-to-day moral decision-making, where our emotional habits of right and wrong guide us, and our biases are just consistent enough to lend us an appearance of integrity. In fact, reason often takes generations to successfully call into question an unquestioned moral habit.
Firm moral habits widely held 150 years ago upheld slavery, those of 100 years ago denied women the right to vote, those of 50 years ago denied marriage equality to same-sex couples. Pressing questions that asked for rational justification of these habits gradually shifted them. Are darker-skinned humans really so different that enslaving them is reasonable? Why shouldn’t women vote? Why shouldn’t gay people marry and have family lives? Repeatedly pressing such questions eventually pushes human brains to draw on parts other than the ventromedial prefrontal cortex in the quest to articulate defensible reasons.
A question now beginning to be pressed in public discourse is: Why exactly do we still have Confederate monuments?
I’m not keen to question all of our moral shortcuts. I want a life that feels sacredness, even if there is no good reason for it – so I have some empathy for those who find sacredness in a statue of Robert E. Lee. I want a life where at least some of my loyalty can be taken for granted without being subject to a need to rationally justify it. I want a life where I trust authorities, like scientists who tell me about climate change when I can’t make the calculations myself. We need our moral shortcuts. For all their irrationality, fickleness, and manipulability, we’d be uprooted, adrift, and lost without them.
The spiritual lesson of this is, first, humility. What feel like our strongest convictions aren’t so strong in a different set of circumstances, and what seems like ironclad logic is probably an illusion of rationalization. So let us hold our own opinions with humility.
Second, relatedly, let us try to have empathy for those with whom we disagree. Yes, their brains have built-in biases. Our brains were built the same way and are just as biased. Where we have no certain access to truth, let us seek to replace the urge to be right with the call to love.
And let us try – not all the time, but sometimes – to take the time to take the slow road of entertaining hard questions of whether our own most precious moral intuitions really are justified.
And one other thing: maybe visit the Against Malaria Foundation website, and send them a donation. I now have.
* * *
This is part 3 of 3 of "Moral Psychology"
See also
Part 1: Do We Want Our Moral Intuitions to Be Rational?
Part 2: Care, Loyalty, Authority, Fairness, Liberty, and Sacredness