Here’s the big question: Is the word “evangelical” losing its theological anchor? If you ask the theologians, an evangelical is someone committed to such things as personal conversion, the authority of the Bible, the cross, and evangelistic-missionary efforts. If you ask the American populace, however, the word “evangelical” means the Religious Right, it means fundamentalism, it means taking America back for God (the fear of Constantinianism, in which the Church rules the State), and it means intolerance. This popular perception concerns me deeply.

Has the meaning of this once-broad term, long used for those who were post-fundamentalist, changed? Does it have any use for those of us who are evangelical moderates or progressives? Has the term been hijacked by neofundamentalism?
I’ve been reading some of Hauerwas recently, and his essay “On Being a Christian and an American” resonates with the concern I am hearing: that far too many evangelicals have capitulated to being identified by their political views. Hauerwas speaks of his concern “to make the world the world” and insists that “Christians have no service [to their country] more important than to be a people capable of the truthful worship of God” (25).
Christians should not, he says, adopt a stance of “mediating language” when they enter the public square. Far too many, he says, are concerned with “making America work” and speaking of “the story of America in which Christians get to have a role.” This story, the one that drives both the Religious Right and the liberal Left, is one “meant to make our God at home in America.”
Instead, he asks, why don’t we tell “the church’s story of America”?
“I believe, therefore, Christians can do nothing more significant in America than to be a people capable of worshiping a God who is to be found in the cross and resurrection of Jesus of Nazareth.”
Quotations from Hauerwas are taken from chapter 1 of his book A Better Hope.