I was e-mailing this morning with a secular atheist liberal writer acquaintance who is working on a piece about why so many of her fellow secular liberals refuse, in her view, to face the plain facts about Islamism and Islamic radicalism. I mentioned to her some of my experiences with fellow journalists who were absolutely immovable on the subject, no matter how many facts and factual analyses I provided. It was almost as if the more facts I presented, the harder they dug into their untenable positions. Of course, conservatives, including religious conservatives, are susceptible to exactly the same process; I have observed that as well, in crazy-making arguments with other conservatives, one of whom, in a memorable exchange, denied that objective, verifiable facts were facts, saying that they were mere opinion. I like to think of myself as someone who tries to be aware of my own biases and how they distort my interpretation of the facts, but I’m sure that I’m guilty of the same thing at times.
This morning I’m somewhat more sure of it, because Reader mm sent along a report on how scientists have found that this kind of thing is built into human cognition. Excerpt:

Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”
These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

More:

There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t.

There seems to be no way to eliminate this hard-wired tendency towards confirmation bias. But it can be fought if we are aware of it in ourselves and work to overcome it. Doing so, though, requires a humble approach to knowledge — that is, recognizing that we don’t know everything, and that we very well might be wrong. We live in a media and political environment that doesn’t reward that kind of reflection (“the worst are full of passionate intensity”). We become absolutely certain of our own rectitude and correct judgment, which is one reason, I think, we see public life as an occasion for scalp-taking (see the Octavia Nasr flap, and the Catholic professor situation). You find among many partisans zero grace toward the Other, no attempt at understanding or empathy, and no willingness to give somebody the benefit of the doubt. The tribal satisfactions of purging and punishing the Other are too pleasurable to resist.

More broadly, the Boston Globe piece I linked to here raises depressing questions about the future of democracy. None of this is exactly new; we’ve known for a long time that people are susceptible to the well-told and pleasing lie over the difficult, painful, or inarticulately put truth. What is new, or at least new-ish, is the surrender by many Americans of the idea that we ought to try to overcome those instincts that prevent us from reasoning clearly, and of the humility that comes from awareness of our own fallibility, which ought to engender grace and tolerance toward opponents. People aren’t rewarded much anymore for being thoughtful and generous to others. Far from being ashamed of their ignorance, folks — even the most intellectually sophisticated, educated folks — are becoming proud of it.
One of the things that has impressed me most about the legacy of the late Sir John Templeton, as I’ve come to read about him since joining his Foundation, is the place he gave humility in his approach to knowledge. He often said that if we aren’t willing to admit squarely that we don’t know something, or that our conclusions might rest on incomplete evidence and on an unwillingness to consider evidence that goes against our own biases, we cannot hope to advance in knowledge. I can see that the worst intellectual mistakes I’ve ever made occurred when I was so sure of myself, and thought that my refusal to consider alternatives was not a matter of the blindness of pride, but of steely, admirable conviction. I turned out to be not just wrong, but badly wrong. I did not know what I didn’t know, and more importantly, wasn’t prepared to accept the possibility that I was wrong. Indeed, I looked at people on my own side who were ambivalent and thought they were suffering from a failure of moral courage. In fact, they were braver than I.
I should add that it is not the case that we should become mush-minded relativists, wringing our hands over everything, unable to decide right from wrong, or to act on what we are reasonably sure is the correct analysis. Obviously there are cases in which we can and must act with intellectual confidence and moral resolve. The problem is that those cases are almost certainly rarer than we think.