Slightly off-topic. Much is made of Christians' role in bringing Hitler to power: he made explicit appeals to the "evangelical" churches about the good morals he would instill in the German citizenry, especially the youth, and they lapped it up easily. This is one of those partial truths that would earn a yellow light on Snopes.
First, "evangelical" does not mean the same thing in German and modern American contexts. In 1930s Germany, it meant Lutheran and Reformed: very mainstream, and already misshapen by a hundred years of bad German theology, which culminated in the "German Christians," a pretty clearly heterodox movement.
Second, that line of argument always seems to ignore that it was American (and British) Christian peace groups that led the charge to keep us out of war in the 1930s, allowing Hitler to consolidate power, build weapons, and start killing Jews. See also the run-up to WWI.
I'm not arguing that the Christian church gets anything like a full pass on this. I'm adding a necessary corrective to a popular accusation.