What is happening in America? Even though there are more Christian resources, ministries, and schools in America today than ever before, it takes only a brief look at our culture to see that it is becoming less “Christian” every day. And it’s not just America; it’s the entire Western world. Why is this happening, and how should we as Christians respond? The Psalmist declares in Psalm 11:3, “If the foundations are destroyed, what can the righteous do?” Why is the Christian foundation of the West being eroded, and how is this related to Genesis? What is the relevance of Genesis in today’s world?