“The decline of white, Christian America is the end of a way of life”

The story of white Christian America is, among other things, a story of changing demographics, evolving attitudes toward organized religion, and an entrenched political establishment that some working-class whites view as being lost at sea.

The Decline of White Christian America, Explained in 4 Books

There are 3 Comments

Bert Perry

Somehow, titling a series of books about the "decline of white, Christian America" bothers me. Don't get me wrong--I love and appreciate the various European cultures that have come here, and I'll often be found at any number of ethnic festivals or restaurants, admiring the foods, clothes, and more of each culture. Somehow, though, the desire to describe very diverse people simply by their pallor bothers me.

Aspiring to be a stick in the mud.

Susan R (Editor/Moderator)

I find the blatant hypocrisy of our society--especially of the mainstream media and outraged minority groups--ironic. Every day it's another sermon or protest about equality and how everyone should be valued regardless of ethnicity, gender, or socio-economic background. And in the next breath it's blahblahblah about privileged whites owing reparations, rich people as predators, and women as the superior gender.

I'll grant that there have been hundreds of years of white guys being in charge of just about everything, but religion and racism aren't the only reasons for this. It's like explaining how a light bulb works by saying "You flip the switch and the light comes on." Yeah--after about 500 other incredibly complex things happened first.