We read in Acts 11:26, “And the disciples were called Christians first in Antioch.” Some commentators say the disciples themselves coined the term. Others say enemies of Christ invented it as a derogatory title. In any case, the core meaning of the word “Christian” is “one who belongs to Christ.”
But we cannot deny that today in the United States and in some other countries, Christianity has lost Christ. We use the term to mean someone who is NOT Muslim, NOT Jewish, NOT… whatever. Or we use it to mean someone who was born in a “Christian” country, whatever that means. Yet many who claim the name Christian exhibit little Christlikeness. So what has happened?
Tim Keller Explains
In his book Christ the King, Timothy Keller explains it this way, referring to an interview with Andrew Walls, who describes why the “center” of Christianity shifts:
“Walls hinted that when Christianity is in a place of power and wealth for a long period, the radical message of sin and grace and the cross can become muted or even lost. Then Christianity starts to transmute into a nice, safe religion, one that’s for respectable people who try to be good. And eventually it becomes virtually dormant in those places and the center moves somewhere else.”
Does that describe you? Does that describe your church? If so, we should heed God’s call in 2 Chronicles 7:14: “If my people, who are called by my name, shall humble themselves, and pray, and seek my face and turn from their wicked ways; then will I hear from heaven, and forgive their sin, and will heal their land.”
One word of caution: you must begin with yourself. You cannot honestly ask God to heal your nation, your church, or your neighbor if you do not first submit yourself to Him. And when you do, He is “faithful and just to forgive” (1 John 1:9).