American Christianity

Words Mean Things — Including the Word 'Christian'

“The decline of white, Christian America is the end of a way of life”

The story of white Christian America is, among other things, a story of changing demographics, evolving attitudes toward organized religion, and an entrenched political establishment that some working-class whites view as adrift.

The Decline of White Christian America, Explained in 4 Books
