Monday, July 20, 2009

Can Factual Information Change People's Minds?

Can factual information change people's minds? Most people assume the answer is "yes." After all, if people believe something that isn't true, then exposing them to the truth should cause them to abandon that belief, right?

Wrong. There's plenty of evidence that life is much more complicated. An interesting post on MotherJones.com entitled The Backfire Effect alerts us to a study suggesting that exposure to the facts may even have the opposite effect to the one you'd expect. (Hat Tip: Canadian Cynic)

The paper is here.

When Corrections Fail:
The persistence of political misperceptions

Brendan Nyhan and Jason Reifler

The authors review the literature and conclude that substantial numbers of people are quite resistant to facts when they hold strong opinions. Surprisingly, some people actually become more convinced they are right after hearing facts that contradict their beliefs. This phenomenon, called the "backfire effect," is actually familiar to us in another context. One example given in the paper is "... that hearing a Democrat argue against using military force in some cases causes Republicans to become more supportive of doing so."

In one of the studies conducted by Nyhan and Reifler, students were divided into two groups. Both groups read a news report quoting from a speech by President Bush in October 2004, about eighteen months after the invasion of Iraq. One group's report was extended with a description of the Duelfer Report, which all but proved that Iraq did not have weapons of mass destruction immediately before the invasion. This additional information is "the correction."

Students were then asked whether they agreed with the following statement.
Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.
Here's the result ...
For very liberal subjects, the correction worked as expected, making them more likely to disagree with the statement that Iraq had WMD compared with controls. The correction did not have a statistically significant effect on individuals who described themselves as liberal, somewhat left of center, or centrist. But most importantly, the effect of the correction for individuals who placed themselves to the right of center ideologically is statistically significant and positive. In other words, the correction backfired – conservatives who received a correction telling them that Iraq did not have WMD were more likely to believe that Iraq had WMD than those in the control condition.
I'm pretty skeptical about these sorts of studies because there are so many variables and the sample sizes are quite small. Nevertheless, this "backfire effect" makes some sense given my own experience in trying to debate various issues.

I exhibit it myself sometimes. Faced with opponents who vigorously disagree with me, I can feel myself being driven to a hardened, more extreme position than I would otherwise hold. In other words, when presented with uncomfortable facts that contradict my point of view, I sometimes work even harder to refute or rationalize those facts. That's more comforting than being forced to admit I'm wrong.

My opponents do this too. In fact, they do it far more often than I do because they are far more likely to start off being wrong.

The bottom line is that you have to be careful to remain objective in the face of factual information. Be prepared to re-evaluate your position if the facts are against you. And don't assume that your opponents will be swayed by correcting their misperceptions. That's only the first step toward changing their minds.

The paper goes on to describe other studies and to discuss possible explanations. It's a good read and I recommend it.

