Jonathan Haidt and others have shown experimentally that information which challenges our beliefs does not weaken our original theory, but fairly rapidly results in our holding the previous belief even more tightly. Village cells in communist China used to have an exercise in which the group attacked the Maoist premises of one of the members, forcing him or her to defend them. I don't know if this was at all widespread, but I read about it in the early 70s. It supposedly made acceptance of doctrine even more complete among party members.
We would hope that reasonable people would pause and reflect when counter-evidence is presented. Apparently this is not so. The default human tendency is to double down on the old belief.
We hear this, we find it plausible, we see it in others and worry with some disquiet whether it is true of us as well. Yet really, how would such a thing work? We can imagine evolutionary strategies where persistence of belief is advantageous, and I discussed earlier this week the advantage to the group that some be extremist. There is advantage to the individual to be that sort of person as well, though it is perhaps a little harder to see it. If one thinks in terms of small hunter-gatherer and then villager groups it becomes easier to see.
But. We also have a bias toward truth, toward reality. Knowing what is really a danger, what is really a food, or really a friend has obvious advantages. We may delude ourselves quite a bit about whether our chief is really a good leader, and functioning as a loyal group works out for us even when the leader is pretty bad. But there has to be a limit to that.
What is it that we tell ourselves in rationalisation, to allow ourselves to keep holding the challenged belief? When we are presented with strong evidence that Bill Clinton is lying, how do we continue to support him? What story do we tell ourselves to justify this?
I have some insight into this, not from a deep understanding of human nature, but from observing what takes place in my own mind. I doubt my experience is universal, but I'm darn sure it's not unique, either, as I see it all around me. When challenged, we focus on the faults of the attacker. Well, yes, under ordinary circumstances we should lessen our support for a president who lies. But these are parlous times, and the evil of his critics is so great that they must be stopped in their tracks at all costs. Politics is a dirty business. There are no perfect people after all. We have no choice, really. Twenty-four hours of telling yourself that and you can go back to liking the guy again.
A friend in Romania described to me in 2000 how the choice in that year's election was quite literally between a fascist and a communist, whatever sweet words they said. (There were smaller parties, such as a Hungarian one that is big in Transylvania.) How do you mark the ballot for either? Only by telling yourself ever more forcefully how bad the other guys are.