Author: Breezy Point Mom
10:04 PM
...it was from a dear friend. She had passed on an email received from another lady (a stranger to me) that read:

Did you hear our President declare Tuesday that we are not a Christian country? Take a look at Newsweek magazine. They have an article in today's issue, “End of Christian America”. www.newsweek.com .

I don’t even know what to say.



So I decided to read it and see the reason for her relative speechlessness. My initial reaction was...

Did this lady read the article? Or was she just put off by its attention-getting title?
I read the article carefully and found nothing offensive in it. In fact, coming as it did from Newsweek, I thought it was pretty well written. Nothing in it should come as a surprise to those of us who call ourselves Christians. The anti-Christian intellectualism of the 19th century eventually found its way into every branch of mainstream culture, and the cultural and moral shift that followed was only to be expected.
I think Christians should read the article thoughtfully, and consider whether Christ's desires for His church have been the modern church's desires for itself, and whether Christians, individually and collectively, have been living at the center of His will. His Kingdom is a spiritual one, after all. I think there are some Christians out there who feel that it is their mission to build a political Christian kingdom, even a theocracy. Maybe they suppose that believers would be able to live more comfortably in such an environment, relatively free from tension with their immediate world. But Christ never promised us this luxury on this earth.

I could spend more time writing a reaction to this article, but I'd rather not. I do, however, think the article is worth a careful, open-minded read.

1 comment:

On April 10, 2009 at 11:43 AM, Sandy said...

I thought the article was balanced and historically accurate. I think part of the problem we're having is that Christians have let churches and a handful of leaders define for them what a 'Christian America' means. They read a book, heard a sermon, or watched a TV program, and now they're convinced that America has always been politically Christian. The truth is that sometimes we were and sometimes we weren't. I do think there has been an obvious decline in recent decades, not only in morality but in the general idea that Christianity is good for society. People are much more likely today to see Christians as the bad guys, mainly because we still assert that some things are sin and should never be done. And while we rail against the demand that we be 'tolerant' in the public sphere, church members demand at church the very tolerance we don't want to give out anywhere else. We want a Christianity that feels good, and if we don't get it, it was the politicians' fault. How very convenient. I'm afraid the Church has a great big plank in her eye.