Many of us see the term Christendom and wonder what on earth it is, so let me give you a brief definition, some history, and begin a series of posts on how an understanding of it impacts the current state of the church and how we understand our faith.
A definition: "'Christendom' is the name given to the religious culture that has dominated Western society since the fourth century" (from Michael Frost's book, Exiles). Now, let me add a little to that. One way to understand Christendom is to understand Christianity as a state religion. Many people see the United States this way, but the pattern goes back long before our country was even a thought.
After Jesus died, there was a period of a few hundred years of great persecution of the early church. People were crucified, tortured, and killed for their faith. Christianity, in short, was not an acceptable religion by the larger culture's standards. It was subversive, revolutionary, and controversial. People died for their belief in Jesus Christ.
In the early fourth century (around 306 A.D.) Constantine came to the throne of the Roman empire. I'm not a history buff, so I'm not sure of all the reasoning, but in short he legalized Christianity (the Edict of Milan, 313 A.D.), and by the end of that century it had become the official religion of the empire. What used to be subversive, illegal, and revolutionary was now socially acceptable (even expected) and the norm of the state. It became the state religion. Interestingly enough, prior to Christianity being a state religion, Christians largely did not participate in the military, but with the 'legalization' of Christianity the military was soon full of Christians. Things changed drastically for the Christian faith.
That paradigm of Christendom, where Christianity was a state religion, has carried us up until recent history. Current writers such as Michael Frost and Gregory Boyd critique the current American situation and call it a post-Christian culture, and I would tend to agree. The age when the church was the central social gathering place, when churchgoers were all assumed to vote one way, and so on...those days are gone, and I think it's for the better. What Christendom has done is make us socially acceptable. There is little to no need to be radical and subversive in a culture that accepts your religion as the religion. Where we are now, Christianity, by and large, is no longer the acceptable religious norm of our country.
The Christian faith was never supposed to be, or intended to be, taken over by any nationality. If you remember the words of Paul, he said there is no longer Jew nor Greek, slave nor free, male nor female...but all are one in Christ Jesus. If you look at the history of the Old Testament, you see that Israel was condemned by the prophets for their nationalistic attitude, as if they were a nation blessed above all others. They were blessed in order to be a light to the nations...set apart...not exclusive. This is what happens in state-endorsed religion...it becomes exclusive.
Our current state of affairs, I believe, likens us to the writings of the prophets and the early church, when they were no longer socially acceptable but lived in exile (Walter Brueggemann and Michael Frost are great reads on this). It is a privilege to live in exile. Living in exile is nomadic in a sense. You don't conform to the culture that surrounds you, but you live in distinct and Jesus-like ways to demonstrate another Kingdom...the Kingdom of God.
This is our current situation. There is no need to wish for the days of old when everyone loved the Christian faith, because I'm not sure it was the greatest representation of the subversive and radical ministry of Jesus we're called to follow. We're in a new situation where we're no longer the acceptable religious norm, and that is good. It calls us to live the life Jesus commanded us to live in new and invigorating ways.
We'll continue to look at the effect of Christendom in the next few posts.