I have heard "debates" over whether America is, or ever was, a "Christian nation" pretty much ever since I've been active in a church. Initially, in the early '60s, I would have said "yes" to the question... but for quite a few years now, I've thought we weren't.
The founders would not have written the Constitution the way they did if they'd intended for us to be that. They seem to have gone quite a ways to ensure we would never be a nation tied to any particular religion.
I've been involved in plenty of discussion in Sunday School about this, and have debated it more than I probably should have. But a couple days ago, sitting in my recliner, one thought came crashing in on me.
Let's say you and I want to start a church, so we do. A couple dozen people, maybe. And we draw up a constitution and a set of bylaws, and we specify that our "church" can never in any way prescribe what you, the members, must believe. We stipulate that we must never pass any regulation respecting your faith. Or lack of faith. Buddhist, Christian, atheist, whatever you want to be is OK with us.
Let's even say we had reasons to do so: religion was crammed down our throats, our parents made us go to their church, we went to parochial schools, and so on.
Could you ever, in your wildest imagination, think of referring to that as a "Christian" organization? Or as any kind of "church"?
I didn't think so. But that's precisely what the organizers of our country, Christians though they may have been, did when they drafted the Constitution and the Bill of Rights.
If God's been favorably disposed toward the USA, it must have been for other reasons. Maybe something as simple as our mandated freedom of religion. But whatever those reasons might be, it's not that we're a "Christian nation".