By Warner Todd Huston
As President Obama engaged in his “America Stinks” tour of Europe this week, he told audiences in Turkey that the U.S. is not a Christian nation. “We do not consider ourselves a Christian nation,” he said on April 6. This echoes his statement in 2007, when Obama told CBN, “whatever we once were, we’re no longer just a Christian nation.”
The subtle difference between those two statements, made just over a year apart, is interesting. Candidate Obama seemed to admit that we might have “once” been a Christian nation but are no longer “just” one. Suddenly, as president, he says squarely that we “do not” consider ourselves a Christian nation. He apparently felt obligated as a candidate to mitigate the belief he now openly admits: that we just aren’t a Christian nation.
In any case, it is obvious that this is Obama’s way of ingratiating himself with Muslim audiences. But whatever his immediate goal, his sentiment is a popular one with Americans who espouse a left-wing, anti-religious ideology, people who look to Obama as their leader.
But is he right? Is it true that we aren’t a Christian nation? Did the Founding Fathers choose the Christian ethic as the one upon which they based this country, or not? Once the historical record is reviewed, the answer to that last question would appear to be an emphatic yes. It would also appear that we are straying far afield from that grounding.
What are we if NOT a Christian Nation?