The Kingdom of God

Why is there so little teaching about the kingdom of God in the West? What there is (various forms of Dominionism, the Social Gospel, and even strains of Prosperity teaching) does not come close to what we see in Christ and in persecuted places around the planet. It is interesting that while Christ spoke more about the kingdom of God than He did about hell, heaven, money, or faith, the modern-day “church” seems to talk about all of those things to the near denial of the kingdom of God. Why do you suppose that is so?

Discussion