Why Giving Reasons Doesn't Work

The pitfalls of the traditional approach to communication.

The conventional wisdom -- the Western intellectual tradition -- is that you change people's minds by giving them reasons. The reality is that this approach is not only ineffective; it's actually counterproductive. Here's why.

From chapter 1 of The Secret Language of Leadership (Jossey-Bass, 2007)

Think back for a moment to the last memo or essay or journal article you wrote, or the last time you gave a presentation. If you followed the traditional model of communication, you went through a familiar trinity of steps.

You stated the problem you were dealing with. Then you analyzed the options. And your conclusion followed from your analysis of the options.

Define problem >> Analyze problem >> Recommend solution

If this was your model, it wasn’t unusual. You were doing what has always been done in organizations or universities. It’s the "normal," the "commonsense," the "rational" way of communicating. It’s an appeal to reason—a model that has been the hallowed Western intellectual tradition ever since the ancient Greeks. It reached its apogee in the twentieth century. And it works well enough when the aim is merely to pass on information to people who want to hear it.

But if you’re trying to get human beings to change what they are doing and act in some fundamentally new way with sustained energy and enthusiasm, it has two serious problems. One, it doesn’t work. And two, it often makes the situation worse.

Giving reasons for change to people who don’t agree with you isn’t just ineffective. A significant body of psychological research shows that it often entrenches them more deeply in opposition to what you are proposing.

In 1979, the psychologist Charles Lord and his colleagues at Stanford University published their classic research on what happens when people are presented with arguments that are at odds with what they currently believe.10 Lord’s team selected twenty-four proponents and twenty-four opponents of capital punishment and showed them studies that confirmed the penalty’s deterrent effect as well as studies that refuted it. What happened? The proponents interpreted the evidence as supporting capital punishment, while the opponents concluded that it refuted the approach. Both sides found clever ways to reinterpret or set aside any contrary evidence so as to confirm their original positions.

For instance, whereas a participant in favor of capital punishment commented on a study confirming the deterrence effect that "the experiment was well thought out, the data collected was valid, and they were able to come up with responses to all criticisms," an opponent of capital punishment said of the same study, "I don’t feel such a straightforward conclusion can be made from the data collected."

On another study showing the opposite, that is, disconfirming the deterrence effect, the roles were reversed. The opponent’s meat became the proponent’s poison and vice versa. The end result was that the proponents and the opponents of capital punishment became even more set in their positions. After they had reviewed the evidence, they were more polarized than before.

The phenomenon, which psychologists call the confirmation bias, was noted by Francis Bacon almost four hundred years ago: "The human understanding when it has once adopted an opinion . . . draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate."

The confirmation bias isn’t entirely illogical. Thus when I glance at a tabloid at the supermarket and read the headline, "Scientists Discover 4,000-Year-Old Television Set in Egyptian Pyramid," I smile and question the reliability of the tabloid, not my belief as to when television was invented. When we think we know something to be objective truth, our immediate reaction to news indicating the opposite is to jump to the conclusion that there must be something wrong with the source. And for many purposes, the confirmation bias serves us well.

But why aren’t we more willing to reconsider our positions in the face of serious factual evidence that should at least give us pause? Aren’t we thinking at all? Apparently not, according to a recent study by the psychologist Drew Westen and his team at Emory University.12 During the 2004 presidential campaign, the team conducted functional magnetic resonance imaging (fMRI) brain scans on fifteen "strong Republicans" and fifteen "strong Democrats" while the participants reviewed blatantly self-contradictory statements by the two candidates, George W. Bush and John Kerry. As we would expect from earlier studies of the confirmation bias, the Democrats found ways to reconcile Kerry’s inconsistencies and became even more strongly Democratic, while the Republicans had no difficulty explaining away George W. Bush’s self-contradictions and became even more fervently Republican.

But the fMRI brain scans showed something new. While the participants were considering the inconsistent statements, the part of the brain associated with reasoning revealed no signs of activity at all. "We did not see," said Westen, "any increased activation of the parts of the brain normally engaged during reasoning. What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion and circuits known to be involved in resolving conflicts."

But there was something even more startling. Once the participants had seen a way to interpret contradictory statements as supporting their original position, the part of the brain involved in reward and pleasure became active, and the conclusion was "massively reinforced . . . with the elimination of negative emotional states and the activation of positive ones."

Remember that involuntary smile that sprang to my lips when I read the headline about the 4,000-year-old TV in the Egyptian pyramids? That smile wasn’t as innocent as it looked. My brain was giving itself a psychic reward for having been able to stick to its original position. The emotional reaction, not my thinking mind, was causing me to be even more passionately attached to my original belief.

The confirmation bias helps explain why the traditional approach of trying to persuade people by giving them reasons to change isn’t a good idea if the audience is at all skeptical, cynical, or hostile. If a leader offers reasons at the outset of a communication to such an audience, the maneuver will likely activate the confirmation bias, and the reasons for change will be reinterpreted as reasons not to change. This occurs without the thinking part of the brain being activated: the audience becomes even more deeply dug into its current contrary position. Reasons don’t work at the outset, because the audience is neither listening nor thinking.

Worse, we also know that skepticism and cynicism are contagious and can quickly turn into epidemics. They are instances of rebellious, antisocial behavior, and in The Tipping Point, Malcolm Gladwell describes how epidemics of such behavior occur in many different settings.15 We see it with hooligans. We see it with teenage smoking. When one person in a group is openly skeptical or cynical, it can create a license for others to be likewise: being a skeptic or a cynic can quickly become the cool thing. In the bar after work, if the coolest person in the group says that the day’s presentation was pure BS, how many others will take the social risk of saying that they thought it made a lot of sense? If they were thoroughly convinced, maybe. But if they themselves found the presentation confusing and hard to understand, the risk is that they’ll go along with the cool guy and agree that yes, it was all BS.

So although we might imagine that giving a presentation discussing and analyzing problems and reaching rational conclusions in favor of change can’t do any harm, we need to think again. Giving a lecture full of abstract reasons arguing for change can quickly turn an audience into an army of strident cynics.
