The Decline of Christianity

With all the talk of Christianity shifting to the southern continents, what are we to make of the so-called Christian West? It is true that missionaries are now steadily sent from Africa, Korea, and China to re-evangelize the West. But is it also true that the West must necessarily decline in Christian faith under the overwhelming evidence and advance of modernism? Rationality trumps faith, so Christianity declines?

In his book, The Death of Christian Britain: Understanding Secularisation 1800–2000, Callum Brown applies a postmodern lens to the triumphalist secularisation narrative. He contends that Christianity in the West, and in Britain in particular, did not steadily decline under secularization, but remained a force to be reckoned with for a good part of the 20th century. His primary argument rests on non-empirical measurements of Christianity. Instead of relying on statistics, he examines the social discourse of Christianity in Britain, noting that self-identity and culture have been consistently shaped by Christianity, whether people are in church or not.

What of the U.S.? There seems to be a steady flow of Christian discourse in politics, science, literature, and elsewhere. Although no county has seen overall growth in the past few years, Christianity still seems to get press. Has Christianity been effectively banished by the statistics, or is there something to the lingering Christian social discourse?

See review here.