While new atheists such as Daniel Dennett, Richard Dawkins, and Victor Stenger generally adhere to the norms of civil academic discourse, others have taken an overly aggressive and often condescending approach that maligns people of faith as thoughtless, gullible, or irrational. Even the American Humanist Association succumbed to this tendency recently by sponsoring a billboard that features a child who snubs God’s invitation with the remark, “I’m getting a bit old for imaginary friends.” The implication is that religious adults are even less emotionally and intellectually mature than this child. Indeed, it suggests, how can any rational, educated person take such transparently illusory beliefs seriously?

This belittling of people of faith certainly doesn’t help atheists’ current public relations problem. In a paper published last year in the Journal of Personality and Social Psychology, for example, the authors found that wherever there are religious majorities, atheists consistently rank as the “least trusted” among minority groups. According to a recent poll conducted by researchers at the University of Minnesota, respondents identified atheists as the minority group least likely both to represent their vision of America and to offer suitable marriage partners for their children.

At the risk of being criticized by some in the secular humanist community for “going soft” on religion, I want to offer three reasons why atheists might embrace a more charitable approach. I am not suggesting that atheists do not have every right to offer pointed criticisms of religion – especially those aspects that have proven injurious to women, minorities, gays and lesbians, indigenous peoples, and “nonbelievers” – but much of the recent acrimony (it is at its worst in the blogosphere) is unwarranted and unhelpful.

One reason for a charitable approach is that we have little control over which beliefs are formed in us. Most arise involuntarily through sense perception, the testimony of people we trust, memories, and reasoning. Although we may on occasion be able to “will to believe” when the evidence for a claim is insufficient or inconclusive, generally beliefs are not formed in this way. For instance, I can’t will myself to believe that there is a jaguar in my office or that unicorns and trolls exist, no matter how hard I may try. Similarly, I can’t will myself not to believe that there’s a creek in my backyard or that my spouse has a mind much like mine. Even when we engage in extensive research on a subject about which we had few or no strong beliefs beforehand, the beliefs that arise toward the end of our journey do so almost automatically. The research itself may require an effort of will, but the beliefs that are formed as a result appear effortlessly.

If it is true that a majority of our beliefs simply impress themselves upon us, it would seem odd to hold a person responsible or blameworthy for the beliefs she currently has – unless, of course, it should be discovered that this person is consciously engaged in self-deception or wishful thinking, routinely violating the established belief-forming practices of her community, or stubbornly refusing to consider any evidence that might challenge her existing beliefs. But in the absence of these behaviors, and assuming that she is doing the best she can with the resources and knowledge available to her, I don’t think she can be blamed or held liable for her current noetic make-up. We might even go so far as to say she has the right to treat her beliefs as “innocent until proven guilty.”

Second, it is generally assumed that we should strive both to have beliefs that seem a good fit with “things as they really are” and to weed out those that misrepresent the world. Put differently, belief formation and revision should be truth-seeking enterprises. I agree that truth-seeking ought to receive priority most of the time, but must it always? It certainly should in our respective areas of expertise. A physician, for instance, is obligated to adjust his beliefs about the human body so that they conform to the latest science, and a biology professor is obligated both to hold and to teach a form of the theory of evolution based on the latest and best data available. But outside our areas of expertise, I see no reason why adjusting our beliefs for a better fit with reality must always be our primary aim.

Say a young woman is drawn to the compassion a nearby community of Pure Land Buddhists exhibits toward all sentient beings and earnestly desires to have this same kind of expansive compassion too. Say also she reaches the conclusion that the surest way to “rewire” her neural pathways so as to acquire it is to adopt their way of life and their conceptual models of reality. Although she may initially find a few of their metaphysical claims implausible – for instance, call upon Amida Buddha but once in full sincerity and you will be transported at death into a “Pure Land” – she is convinced that over time her mind will become habituated at least to some of them. No one really knows what lies beyond anyway, she may think, and perhaps this community is right about the existence of a postmortem realm where adherents might continue their path toward liberation.