Sunday, November 11, 2012

Why Believers shouldn't be too quick to claim America as a Christian nation...

I believe this applies to both Left and Right...the Right seems to have romanticized a period in time when America committed some of its greatest shames, and the Left seems oblivious to the diluted form of Christianity it welcomes in under the guise of Love.

I've reached a point in my life where I'm probably more exposed to White American culture, at least proportionately, than ever before. While I may have been more intimately entwined as a child, I no longer spend the large amounts of time in Black American church and community organizations that I did through high school and even in my predominantly White college. If there is one thing I know, it is that Blacks and Whites have a profoundly different relationship with America. As a Black Christian, I feel hesitation in claiming America as a Christian nation--keep in mind that is entirely different from desiring that it reflect Christian values or be blessed by God. I have no illusions that calling something "Christian" changes anything in God's eyes. The phrase itself honestly never crosses my mind. I feel no pain when someone claims that "America is no longer a Christian nation," likely because I would have had to believe the phrase wholeheartedly to begin with. I know that the majority of the founders were, yes, Christian, but you also have deists and secular humanists like Jefferson and Franklin. Our founding documents refer to God and, as far as I know, make no mention of Jesus. We can hardly claim 100% Christian lineage.

I sometimes think of how we look in the eyes of others--how could I proudly say to someone from a foreign nation...or from this one, for that matter, that America has always been "a Christian nation" in the face of the atrocities of Slavery or the brutality of the American Civil War? The mistreatment of workers, children included, for the sake of business during our Industrial Revolution? Yes, every country has its failings, and even the redeemed aren't perfect, but some churches' participation in, and silence during, these events is a different, deeper issue. Plus, many a nation has seen prosperity with far less affiliation with Christ--I am no follower of the prosperity gospel and won't apply it any more to history than I will to my personal life.

I really wonder if either camp, Left or Right, understands what it really means to lazily utter blanket statements like "America is a Christian Nation" or "America is becoming more Christ-like," while all those who are not believers plainly see the atrocities of war, slavery, abuse and dissension that were going on internally during a time they revere so greatly--and also see how miserable and joyless we are growing in our prosperity and "freedom." I will not deny that America's best and most positively transformative leaders have been born out of the country's strong Christian roots--I find the story of Abraham Lincoln's parents' meeting awe-inspiring...I will not deny that the church has been instrumental, if not catalytic, in combating some of the country's greatest sins--but it is careless and damaging to the faith to claim the ubiquitous presence and favor of God in relation to other nations. I do think, though, that for me it all comes down to the fact that I draw a harsh line between the concepts of "church-going" and "Christian," and that I believe only individuals can be Christian.

I sometimes wonder if the evidence we see of a decrease in humane behavior, and of our falling victim to our basest animal instincts, is proof that we are slipping away from our Christian roots, or just a sign that we were holding on to them tenuously in the first place, and that is all now being brought to light.

A passage we covered in Romans 2 tonight captures what I'm feeling succinctly: "You who boast in the law, do you dishonor God by breaking the law? As it is written: 'God's name is blasphemed among the Gentiles because of you.'" (Romans 2:23-24)
