Saturday, July 22, 2006

On True Believers and self-delusion

What makes some people believe in the truth of supernatural and psychic phenomena, even when the alleged medium has himself admitted that the events were fraudulently staged?
In the face of evidence to the contrary that an outside observer would find overwhelmingly persuasive, why do some cling to these beliefs?
The term true-believer syndrome was coined by former psychic M. Lamar Keene in his 1976 book The Psychic Mafia (reissued by Prometheus Books in 1997). "True believer" as a phrase predates the book. The concept embodies the two questions that open this post.
Although not a generally accepted psychiatric term (it lacks a supporting body of research), it describes an apparent breakdown of reason: a surrender to the urge to minimise cognitive dissonance through post hoc rationalisation.

An article on true-believer syndrome in The Skeptic's Dictionary [1] makes clear that the idea does not shed light on why people hold on to erroneous beliefs: "Since by definition those suffering from true-believer syndrome are irrationally committed to their beliefs, there is no point in arguing with them. Evidence and logical argument mean nothing to them. Such people are incapable of being persuaded by evidence and argument that their notions are in error".

It is therefore questionable whether the term can be used for anything beyond labelling people True Believers, which is in my opinion an apt term of abuse for those who believe for belief's sake.
One can safely say that in many cases the comfort that follows from the will to believe is far more appealing than the distress of having to revise a clearly mistaken notion. The post hoc rationalisations include the assurance that even if a person has confessed to fraud, the ability demonstrated might still be genuine. That is, a person could have psychic powers without consciously knowing about them, as the true believers claimed after James Randi's infamous Carlos hoax [2].

Perhaps the most distressing example of true believers is when they flock around a charismatic guru, in effect forming a cult. As is typical of cults, spoonful after spoonful of dogma is forced down the believers' throats, and they willingly swallow the guru's supposed infallibility along with it.
When a cult follower is subsequently "deprogrammed" and realises the utter foolishness in which they have engaged, a slow reconstruction of memory is not unknown to occur. Asked about life as a cult member, the reformed one often demonises the leader, describes unbearable living conditions, and names fear as the main reason for staying in the group.
I believe that in many cases the new "me", in light of current understanding, reinterprets the past. If the circumstances really were that terrible, I find it hard to see why anyone would stay a minute longer than they felt they had to. Rather than fear, the feeling of belonging to something special might lead some to stay in a group that their visceral intuitions know to be unhealthy; so ingrained is the social nature of the human animal.

None of this explains why people like to be deluded, whether by themselves or by others, though ultimately always by themselves. Wishful thinking explains a lot, in my opinion, but not everything. There is a sea of cognitive biases, or mistakes of reason, a few of which are elegantly addressed in Thomas Gilovich's insightful book How We Know What Isn't So. The study of cognitive biases, and cognitive science in general, is a fairly young field, and I find it incredibly interesting.

Where do we go from here? It is clear that people tend to be more willing to believe in something they wish to be true. On what evidence, when blind faith won't do, does one ground the assertion that a certain belief is true? This is where confirmation bias [3] comes into play: one tends to seek out evidence that supports a preconceived belief, and to disregard or rationalise any piece of evidence that negates it. If, for example, you are awfully certain that your wife is cheating on you, you will probably find evidence, short of catching her red-handed, that reinforces the suspicion. It is also not unheard of for undue distrust to incite the dreaded infidelity. I think Mr. Reed previously concluded that we believe what we wish to be true or what we fear to be true, and so it is in this case.

The strength of the illusion is, if not the main factor, most likely one of the key ones. Let's face it: reality is a bitch, and we seldom get what we want. That's a good starting point. Nevertheless we struggle, the beautiful fight that is our daily lives, and sometimes, sometimes, we get what we want. Few things come easily, but when we ultimately get them, it makes it all worthwhile. That is why I think holding erroneous beliefs is an awful waste of time: they cannot possibly bear as much fruit as properly informed views have the potential to. You have simply chosen the wrong path.
Humans were not designed to seek out truth; out of necessity, we have merely evolved a need to find and explain patterns in our world, the understanding of which helped us survive. If those patterns and the conclusions we derive from them are false, they can thankfully be corrected, provided one keeps an open mind. The ability to know the world and the way it operates through abstraction is a source of joy for many, including me. That is why I find it pivotal to always be open to new ideas, and to adopt them when they are proven beyond reasonable doubt. And that is why self-delusion is such a sad practice.

[1] The Skeptic's Dictionary entry on true-believer syndrome
[2] Ibid. "Carlos hoax"
[3] Ibid. "Confirmation bias"

2 Comments:

Blogger J.P. Reed had the audacity to say...

Yet again I can do nothing but agree, dear brother.

Oh, and as the head of the committee I vow to put this matter to a swift and decisive closure.

16:53  
Blogger J.R. Libel had the audacity to say...

I would expect nothing less from you.

20:02  
