Our current President is fond of suggesting that Islam is a religion that promotes peace (as did G. W. Bush). Others, including many in conservative evangelical and biblical fundamentalist circles, insist that “real” Islam, as taught in the Quran and other authoritative texts, is inherently violent toward all who do not embrace its belief system.
So what does “real” Islam teach about peace, jihad, and other human-rights topics?
Americans, and especially Christians, really ought to stop trying to answer this question. We should also stop making generalizations based on what we believe to be the correct answer. Here’s why.