What Do Christians Believe?


This is a vast question to which no brief answer can do justice.

However, at the heart of the Christian faith is the belief that God Himself came to live among us in the person of Jesus Christ. His birth, his life, his teachings, his death and his resurrection changed the world forever. Christians are people who spend a lifetime exploring what this means.

At the very least, it means this: that out of love for you and me, God did not stay in His heaven but came to live among us ...

... to know our joys and sorrows

... to rescue us from our mess and regrets

... and to show us life in all its fullness.