Is Christianity really a western religion?

United States
March 19, 2007 1:55pm CST
Christianity has long been associated with western culture. But is Christianity really a western religion? Even though it is the dominant religion in the West, Christianity began in the Middle East. So why do some people refer to Christianity as a western religion (beyond the fact that the United States was founded by Christians seeking religious freedom, and that Christianity is the leading religion in the States)? I don't know whether people have a reason for saying it or are simply unaware of the historical origins of Christianity. What are your thoughts?