I am concerned about how Christianity is perceived by those outside the faith. It’s not that Christians are ridiculed in the media (we’ve always been ridiculed), but that what people assume I mean when I call myself a Christian is so different from what I actually mean that I’m almost hesitant to use the word. I am not ashamed of the label, but the misunderstanding runs so deep that calling myself a Christian conveys almost the exact opposite of what I intend. I continue to use the word only because every alternative term makes things even worse.
So, what do people think Christianity is? What do Christians mean when they use the word? How did it get this way? And what do we do about it?