I hear people say something like this all the time: "This is a Christian nation; we need to keep it a Christian nation." I have a question about that…

When was this country, or this continent, ever governed by Christ or by Christians? How was this country ever classified as Christian?

I know there is a sect of Christianity that makes a different claim about Christ and this continent, and I am not educated enough about that sect to address it. What I honestly want to know is this: in what period since the first "settlers" came to America was this a Christian nation?

Was it when we (we Anglos) killed native and indigenous peoples, or when we pushed them onto barren land and into dangerous wilderness? Was it when we supported the slave trade and owned other human beings? Was it when we freed the slaves? Try again: in what sense were they free? They couldn't vote, own much property, or even mix with other races. Was it when we finally gave women the right to vote and almost started treating them equally? Was it when we didn't allow Jews and African Americans to attend our educational institutions with us? Was it when we rounded up Japanese Americans and put them in internment camps? Was it when we paid women less than men for the same work? (Oh wait, that still happens.) Was it when we complained about the alien in this country, the one treated like an outsider? (No wait, we still do that too.) Was it when we hated on an African American running for president, making racist remarks and slandering his name? (Nope, wait, still doing that.)

Are you with me yet? At what point in time, exactly, were we a Christian nation in America? What has this country done that would show other countries who Christ is?

Politics and religion are separate things. Of course the US needs unity, or the "American religion," as Greg Boyd pegs it in his book The Myth of a Christian Nation. That unity, though, is falsely staked out as Christian, and that is not the case. It's okay to love your country and want to represent it well, but you cannot legislate a religion onto the people. That was part of the purpose of those who came to settle in America; these people wanted to be completely detached from a government that dictated their faith.

Christ is the example to Christians of how to live. Christ didn't run for office, and he didn't address how to "run" politics. He did say "we are not OF this world," and that being the case, Christians are not of politics. That doesn't mean I think Christians should stay out of politics. I do believe we, as a nation, have a civic duty to serve in a human way. It will not be perfect, but it's the best we have. Government in the United States was originally created to deal with foreign diplomacy and handle the country's finances. It has turned into a big mill of programs, spending, and legislating the agendas of various groups, including Christians.

To me, it seems the better thing to do as a Christian is to put more effort into showing others who Christ is: showing compassion to your neighbors, feeding the hungry, clothing the poor, giving someone gas money when they can't afford to get to work, watching a couple's child so they can reconnect on a date, giving a ride to the mom who walks her kid to school every day, and on and on.

The Christian nation exists on a different level: not as a country on a map, but as a kingdom in your heart. That is the only thing individuals control. Serve that kingdom as well, in a different way than you serve America.
