Christians risk becoming utterly irrelevant in their own culture if they continue to separate people into "We the Saved" and "They the Damned". Again, I ask, do we need Jesus to protect us from God? Is that what Christianity as we've known it is about? Are we saved from God by God?