Saturday, March 29, 2014

The Mechanism For Extremism

Jonathan Haidt, among others, has shown experimentally that information which challenges a belief does not weaken it, but fairly rapidly results in our holding the previous belief ever more tightly.  Village cells in communist China used to have an exercise in which the group attacked the Maoist premises of one of its members, forcing him or her to defend them.  I don't know whether this was at all widespread, but I read about it in the early 70's. It supposedly made acceptance of doctrine even more complete among party members. 

We would hope that reasonable people would pause and reflect if counter-evidence is provided.  Apparently this is not so. The default human tendency is to double down on the old belief.

We hear this, we find it plausible, we see it in others, and we worry with some disquiet whether it is true of us as well. Yet really, how would such a thing work?  We can imagine evolutionary strategies in which persistence of belief is advantageous, and I discussed earlier this week the advantage to the group of having some members be extremist.  There is an advantage to the individual in being that sort of person as well, though it is perhaps a little harder to see; it becomes easier if one thinks in terms of small hunter-gatherer bands and then villages.

But.  We also have a bias toward truth, toward reality.  Knowing what is really a danger, what is really a food, or really a friend has obvious advantages. We may delude ourselves quite a bit about whether our chief is really a good leader, and functioning as a loyal group works out for us even when the leader is pretty bad.  But there has to be a limit to that.

What is it that we tell ourselves in rationalisation, to allow ourselves to keep holding the challenged belief?  When we are presented with strong evidence that Bill Clinton is lying, how do we continue to support him?  What story do we tell ourselves to justify this?

I have some insight into this, not from a deep understanding of human nature, but from observing what takes place in my own mind.  I doubt my experience is universal, but I'm darn sure it's not unique, either, as I see it all around me. When challenged, we focus on the faults of the attacker. Well, yes, under ordinary circumstances we should lessen our support for a president who lies.  But these are parlous times, and the evil of his critics is so great that they must be stopped in their tracks at all costs.  Politics is a dirty business.  There are no perfect people after all.  We have no choice, really. Twenty-four hours of telling yourself that and you can go back to liking the guy again.

A friend in Romania described to me in 2000 how the choice in that year's election was quite literally between a fascist and a communist, whatever sweet words they said. (There are smaller parties, such as a Hungarian one that is big in Transylvania.) How do you mark the ballot for either?  Only by telling yourself ever more forcefully how bad the other guys are.

9 comments:

Anonymous said...

"We would hope that reasonable people would pause and reflect if counter-evidence is provided."

reasonable people don't disagree on the facts, but may, and almost always do, disagree on their interpretation.

and more often than not facts are taken as prima facie evidence for one theory or another. that is, that they are mere facts is missed.

the problem with the so-called "nature/nurture" debate is that neither side has a theory which could possibly correspond to the way the world is in fact. the "conceptual apparatus" is too simple.

terri said...

I have read this many times. I do wonder if this is a temporary effect. For instance, if one is confronted with evidence against a belief one time, in one particular confrontation, is the reaction different than if a person is confronted with the same evidence on multiple occasions and in different contexts?

Is there a tipping point where belief contrary to evidence begins to be undermined?

I think there must be, otherwise no one would ever change their deeply held beliefs.

Assistant Village Idiot said...

I also think there must be, else no one would or could change. Your mention of different contexts may be important.

I will guess that the paying of some cost for the belief has its effect. Socially, we don't like to pay ostracism costs and may learn to keep our unpopular ideas to ourselves. Those might more easily wither.

It comes up in small ways. We have discussions over whether adults should pay for their Sunday School materials. On the one hand, the gospel is free, and any suggestion that it is not should be avoided. Then too, we don't want anyone to miss out because they don't want the embarrassment of admitting that the cost is a burden.

Against this is the repeated observation that if people purchase the book they are more likely to attend class, do the homework, and say afterward that they learned something. On a small, or perhaps as you say temporary, level, paying a cost strengthens the belief. But what is disillusionment but finding that the cost was too high - that what you gave in heart or mind or wallet was not worth what you got back?

There is a turning point. I am guessing that there is variation between individuals and between cultures in what costs they will bear before the scales drop from the eyes. Yet there may not be as much variation as we initially imagine. Dictators and con men succeed because the behavior of large numbers of people is somewhat stable.

james said...

Perhaps one reason for clinging hard to old beliefs is the implications of the new.

If somebody explains to me that Czechoslovakia is now two separate countries, that doesn’t mean anybody lied to me: I just missed that part of the news and never paid any attention to middle Europe. There are no consequences to the correction. Similarly if you explain to the man in the street that it is the magnetic component of an electromagnetic wave reflected off a conductor that is parallel to the conductor, and not the electric field component, that’s no skin off his nose; he just was sleeping back in high school physics class. He may argue a minute for the sake of the integrity of his memory, but he has nothing invested in the issue.

But suppose you tell him that Brazil is really two distinct countries with different governments and that they’ve been at war with each other for the past two years, and supply adequate proof of the claim. That’s a big deal, because it means that sources he trusted have kept the truth from him, and probably not just about Brazil. Huge swaths of what he thinks he knows are at risk, including things he may regularly rely on and have other reason to believe. ("The same paper that writes about Brazil writes about Chicago, and I've been there and they were right.")

So, do I go with massive changes in my knowledge and rebuilding the database from scratch, or do I suspect that you are misled? Maybe I think it better to have a database tainted by one mistake than an empty one?

In politics similar revelations mean you have to choose between trusting someone you used to trust and trusting someone you can’t trust at all.

Anonymous said...

Politics has nothing to do with trust and rationality. It's pure team spirit.

MY band is strong! Our leaders are wise and good! We are wise and good because we are part of OUR band!

Your band is weak! Your leaders are corrupt and stupid! You are stupid and bad because you are part of YOUR band!

Just paste in some political party names and you've got 99% of contemporary discourse.

james said...

At lunch at CERN a few years ago, I was called on to explain how it could be that Bush had beaten Kerry, because everyone knew Kerry was better. I tried to point out that there were a lot of local issues they weren't attuned to, and that the sample they heard their news from wasn't representative. The conversation went instantly dead after the latter. Technically they weren't on either team; they didn't think much of Kerry (he was an American, after all).

There is a huge emotional bond to the tribe, but I think there's an intellectual tie as well--a daunting barrier to changing your mind.

David Foster said...

The tendency to look for evidence supporting one's existing beliefs, and to ignore evidence contrary to them, is not limited to political matters. "Confirmation bias" is well known to accident investigators.

For example, the NTSB report on the Comair flight that attempted takeoff from the wrong runway suggests that the First Officer's observation during the takeoff briefing that "the lights are out all over the place" may have led the crew to believe that the lights might also be out on runway 22...where they were supposed to be...rather than to the correct conclusion that the lights were out because the runway they were on wasn't the one to use.

Also, the FO commented as they rolled down the (incorrect) runway 26 that "[that] is weird with no lights," to which the Captain responded "yeah," rather than with the only response that could have saved the flight, which was an immediate abort with maximum braking. At that point, with the airplane nearing 100 knots, an abort would have involved heavy deceleration, major upset to the passengers, and possibly employment consequences for the flight crew. The more general point is that confirmation bias is probably strongest when the costs of changing one's hypothesis are very clear.

Of course, in this case, the costs of *not* changing the hypothesis were a lot higher.

Retriever said...

This post is excellent, thanks AVI. I need to chew over it awhile...A few immediate reactions, tho:
--First, I actually often find myself reacting vehemently for something or someone, in the face of mounting evidence that they are wrong or not what I had hoped, partially because I don't want to feel that I have been fooled again. On the one hand, I claim to be cynical, and not to believe in ANY politicians or leaders, and I am the most cynical and negative of all about church leaders right now. I don't like ANY pastors I know right now, except one older woman who is retiring...My point, tho, is that despite my avowed jaded, cynical attitude, I actually end up wanting to rely on and trust in somebody to rescue the country or the parish or the town or whatever. Perhaps because in my lazier moments I think that will get me off the hook a bit, and let me leave it to them. So I don't like the evidence of their feet of clay, and try (at first) not to listen to criticisms. A dramatic example came when my former spiritual mentor became a bishop over in Europe. I heard disquieting rumors that he was becoming more and more PC, but dismissed them because my former parish was so extremely conservative that I thought they were exaggerating. In point of fact, the guy ended up espousing atheism!!! Appalling.

What I find I do is then abruptly swing and react violently against the person. Resenting having been fooled/let down. Tho I should really just be mad at myself for being a jackass and being blind.

Second thought: we are all just so damn tribal initially. This totally influences our opinions. The good news is that most Americans (except perhaps in racist pockets) do not murder people over it. I think that the best remedy for extremism and the best way to change people's wrongheaded opinions is not argument or name calling or sensitivity training (like you, AVI, I work somewhere that forces us to endure VILE trainings that just teach people to dissemble and talk the talk while changing no thoughts or feelings). People change when they live and work with "the other". So, for example, I hate the abstract concept of people illegally entering my country, stealing benefits paid for with my taxes, etc., and taking jobs so that my disabled kid and college-aged kids can't find entry-level work. But in practice, there are illegals everywhere I go, and when you worship and eat and sometimes work with someone you can't hate them as an abstraction any more. You can still be pissed at the unfairness of your kid not having a job and them not paying taxes and working under the table, but you can't demonize them. In practice, you have to learn to live together.

Some differences, tho, never get easier. I work around people who literally have no use for religion, most of whom talk about wacko pro-life types, and who see nothing wrong with divorce, or their kids getting drunk til they puke, hooking up, etc. Different values from my family's. I try to live and let live, but I cringe at some of the things I hear.