By Geoffrey Dean
This article is a simplified and updated version of the article "Undeceiving Ourselves" by Geoffrey Dean, Ivan W Kelly and Arthur Mather that originally appeared in The Skeptic Encyclopedia of Pseudoscience edited by Michael Shermer, ABC-CLIO 2002, Volume 1 pages 272-277. According to the Oxford English Dictionary, undeceive is a long-established word dating from 1598 with various derivatives such as undeceivableness. Its meaning is exactly what it says -- to free from deception, to reveal the truth to someone previously misled.
"What a piece of work is man! How noble in reason! How infinite in faculties!" said Shakespeare's Hamlet. And for simple everyday living, Hamlet was right. We really are a noble piece of work, otherwise we would not be here. But when things get complicated, as in deciding the truth of a strange belief, Hamlet was dead wrong. Nobody with infinite faculties would still be arguing after 2000 years about astrology or religion or how to bring up children. In fact, "How poor in reason! How limited in faculties!" would be nearer the truth.
Blame it on evolution
Evolution left its mark on how we learn. Children became programmed to learn quickly whatever they were told. They had to learn -- and learn fast -- that fire burns and dogs bite, or they would not survive. Adults became programmed to act on what seemed like a connection or a pattern even if none actually existed. If shouting or praying during an eclipse was followed by the reappearance of sunlight, then people learned to shout or pray whenever there was an eclipse. The pattern was false but it seemed to work.
Making sense out of patterns
Is truth relevant? We like to think so, but society often sees truth or falsity as less important than believing. Faith is respected, skepticism is not. Disbelievers were once burned at the stake, and religion can still lead to war. It is faith, not reason, that kills, as happened at Jonestown in 1978 when 900 people died due to faith in their religious leader. Is logic (sound reasoning) relevant? Again, we like to think so, but logic in everyday life is often unrealistic. Nobody reasons logically to decide between strawberry and vanilla ice cream. And logic is often not justified anyway, simply because most errors are of little consequence.
In short, we are programmed by evolution to believe almost anything. What matters most is not truth or logic but content. Or, as Bertrand Russell said, what men want is not knowledge but certainty. For most of us, life becomes very difficult without the certainty provided by a belief system -- any belief system. Thus, one of the few valid generalisations in social psychology is the "principle of certainty", which says when there is evidence both for and against a belief, most people show not high levels of uncertainty, which would be the most sensible reaction, but high levels of certainty either for or against, which is not at all sensible. For them, it is better to be wrong than uncertain.
Effects of complexity
There is actually more to astrology than being a pseudoscience. For example, studies have shown it may have merit as an ice breaker, or as a focus, for some clients receiving therapy by conversation, where its truth or falsity is of little consequence.
As a result, we tend to make judgements by assuming what we don't actually know (how many of us have actually tested our beliefs?), and by finding connections where none actually exist. So we are much less bothered by worthless data than we ought to be. All of these things are a legacy from our evolution. In fact, these things come so naturally that the problems were largely unsuspected until people tried to make a computer model of how we think. The problems arise when we want to find real connections and avoid mistakes, as when we first meet a strange belief.
To find connections where none actually exist, the only requirement is that our belief be established in advance (say, by reading about it), regardless of whether the belief is true or false. Suppose we believe that people with red hair have hot tempers. Most likely our experience of redheads is not clear-cut but rather vague, so our belief cannot fail to be confirmed -- we will see vague behaviour as hot-tempered, and vaguely red hair as genuinely red. Truth or falsity will not come into it. If it seems ridiculous that your judgement could be affected by knowing the answer in advance, try making sense of this statement: "The trip was not delayed because the bottle shattered." The statement will seem vague and meaningless. But try again, this time thinking about christening a ship. The statement now seems crystal clear, and your belief that it is about a ship will seem amply confirmed.
But the statement is actually about dropping a bottle of Coke on a hiking trip. So your judgement of a vague and unclear behaviour was determined not by truth or falsity but by knowing the supposed answer in advance.
What if we have no prior beliefs? Here we can be led astray by another legacy from evolution, a potent learning process that occurs whenever something we do (accepting a belief, placing a bet) is followed by something else (feeling good, winning something). When the time interval is short, learning is automatic, and we can end up believing the two events are connected when they are not. Even worse, contrary to what we might expect, our belief becomes very resistant to change if the two events occur only occasionally rather than all the time. Thus occasional winning at roulette encourages further bets because we see that losing does not stop us winning, whereas fifty losses in a row persuades us to give up. Because two events can occur together just by chance, we can end up believing all kinds of things that are actually false.
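How easily chance alone supplies "confirming" cases can be checked with a short simulation. The sketch below is purely illustrative (the events, probabilities and numbers are hypothetical, not from any study): it pairs a ritual performed on some days with a good outcome that happens on some days anyway, with no connection whatever between the two, and counts how often they coincide.

```python
import random

random.seed(1)

DAYS = 10_000
P_RITUAL = 0.3  # chance the ritual is performed on a given day
P_GOOD = 0.5    # chance of a good outcome, regardless of the ritual

# The two events are generated independently: no causal link exists.
ritual = [random.random() < P_RITUAL for _ in range(DAYS)]
good = [random.random() < P_GOOD for _ in range(DAYS)]

# Count the yes/yes days a believer would remember as "confirmations".
confirmations = sum(r and g for r, g in zip(ritual, good))

print(confirmations)  # roughly DAYS * P_RITUAL * P_GOOD = about 1500
```

Roughly 1,500 of the 10,000 days look as if the ritual "worked", even though the outcome ignores the ritual entirely. Only by comparing all four combinations of ritual and outcome does the absence of a link become visible.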
In a classic study, nurses were shown records for 100 patients, each noting whether the patient had a certain symptom and whether they had a certain disease, and were asked: are symptom and disease related in these data? The correct answer is no. But 80% of the nurses said yes, 7% said no, and the rest gave up. When asked to explain how they got their wrong answers, the majority of nurses said the most common combination was yes/yes, therefore disease and symptom were related. They had ignored the other combinations, which show the opposite -- the symptom is slightly more common among those with no disease (17/30 = 0.57 versus 37/70 = 0.53). Similarly, if asked whether redheads are hot-tempered or Leos are generous, hardly anyone considers even-tempered brunettes or tight-fisted non-Leos. Yet no link can exist unless redheads differ in temper from brunettes, and Leos differ in generosity from non-Leos. In short, no conclusions are possible without data for all four combinations (yes/yes, yes/no, no/yes, no/no). So be suspicious when believers consider only yes/yes combinations, as they usually do -- for example, counting only the predictions that come true.
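The arithmetic can be checked directly. The sketch below reconstructs the four cell counts from the figures in the text (37 of the 70 with the disease had the symptom; 17 of the 30 without it did) and compares the two symptom rates:

```python
# 2x2 table reconstructed from the figures in the text
# (cell counts follow from 37/70 with the disease and 17/30 without).
symptom_and_disease = 37   # yes/yes: disease present, symptom present
disease_no_symptom = 33    # yes/no:  disease present, symptom absent
symptom_no_disease = 17    # no/yes:  disease absent, symptom present
neither = 13               # no/no:   disease absent, symptom absent

rate_with_disease = symptom_and_disease / (symptom_and_disease + disease_no_symptom)
rate_without_disease = symptom_no_disease / (symptom_no_disease + neither)

print(round(rate_with_disease, 2))     # 0.53
print(round(rate_without_disease, 2))  # 0.57
```

The symptom is, if anything, slightly more common among patients without the disease, so the yes/yes cell alone -- however large it looks -- tells us nothing about a link.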
It gets worse. Once we move from data on paper to data drawn from memory, we become subject to further judgement errors such as the following, largely because memory is a process of reconstruction rather than retrieval:
Vividness -- we focus on vivid things, ignore dull things
Now for the good news. We already have in place countermeasures against deceiving ourselves. They did not come quickly or easily, but they have been enormously successful. They are known as science. Or, as the Nobel prize-winning physicist Richard Feynman said, "science is what we have learned about not fooling ourselves". Of course not everyone can be a scientist, but everyone can benefit from the insights of science, which lead to the following strategies (in no special order).
Avoid emotional involvement. Remember that judgement errors are pervasive even though most people are unaware of them. Unless a claim is supported by a tally of confirming and disconfirming cases, you can safely assume that judgement errors are alive and well. And watch your own emotional involvement: hell hath no fury like a cherished belief under attack. Which is more desirable -- feeling secure or being right? How much would it matter if your belief was wrong?
Ask questions. The aim is not to win but to learn. Ask believers in a strange belief the following questions: Why do you believe in it? This puts the burden of proof where it belongs -- on the claimant. What evidence would you accept as posing a problem for your belief? This is a potent question because it opposes the tendency to consider only favourable cases. Are there other explanations that could produce the same effect? This too is a potent question. Where did your idea come from? A credible source means the idea may be plausible even if the previous answers are unsatisfactory. Why should we believe in it? This restates the previous questions from our own viewpoint. For variants on these questions see Crash Course in Critical Thinking on this website under Classroom Resources.
Think about other explanations. Try to provide a plausible rival hypothesis. For example the Draw-a-Person personality test has been largely abandoned because the hypothesis "unusual person = inner conflicts" was displaced by the more plausible hypothesis "unusual person = lack of artistic ability." If you cannot think of a rival hypothesis, consider what might be the more plausible: X is true, or X is due to human judgement errors.
Should you keep an open mind? If X is possible but you have no evidence for or against, should you keep an open mind? The question is deceptive because the word possible is ambiguous. It can mean barely possible (if you jump off a cliff, it is possible that a freak wind will save you), or it can mean seriously possible (if you jump off a cliff, it is possible that you will die). Bare possibilities are vastly more numerous than could ever be studied, so only serious possibilities deserve an open mind. But an open mind requires us to tolerate uncertainty, which most of us find extremely difficult. Is the believer really open-minded? Be aware that believers use open-mindedness to frustrate criticism -- to them anything goes, which to them confirms their open minds, whereas requests for evidence indicate a closed mind. But their call for open minds is no more than a call to abandon all criticism. In effect it provides the glue without which their belief might fall apart.
For data snoopers. If you snoop around in data looking for something interesting, then your judgement errors are bad news. Try the following remedies: Graph the results so you can see what is happening. Test the findings from half of the sample on the other half. Compare your results with those of similar or nearly similar studies. Replicate on fresh data or, if fresh data is unavailable, on random data.
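The split-half remedy can be illustrated with pure noise. In the hypothetical sketch below (the sample sizes and variable counts are illustrative assumptions, not from the article), twenty random "predictor" variables are snooped for the one best correlated with a random outcome in the first half of a sample; retesting that same variable on the held-out half shows the apparent effect shrinking back towards zero.

```python
import random
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

random.seed(42)
N, N_VARS, TRIALS = 200, 20, 100

snooped, held_out = [], []
for _ in range(TRIALS):
    # Everything here is random noise: no variable is truly linked to the outcome.
    outcome = [random.gauss(0, 1) for _ in range(N)]
    variables = [[random.gauss(0, 1) for _ in range(N)] for _ in range(N_VARS)]
    half = N // 2
    # Snoop: pick the variable with the strongest correlation in the first half.
    best = max(variables, key=lambda v: abs(pearson(v[:half], outcome[:half])))
    snooped.append(abs(pearson(best[:half], outcome[:half])))
    # Remedy: retest that same variable on the untouched second half.
    held_out.append(abs(pearson(best[half:], outcome[half:])))

print(round(statistics.fmean(snooped), 2), round(statistics.fmean(held_out), 2))
```

On average the snooped correlation is several times larger than its held-out replication, even though every variable is pure noise -- which is exactly why findings dredged from data must be retested on data that played no part in finding them.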
Be skeptical of skeptics. Unfortunately, some skeptics are as intolerant of contrary views as any committed believer. Their ploys tend to be as follows: (1) They keep raising the standards of evidence, or they find trivial flaws and claim they are fatal. Remedy: have them set the standards. (2) They deny the case simply because it is impossible or unlikely. Remedy: have them give reasons. (3) They make false claims such as "there are no cases of X", when in fact there are many such cases. Remedy: be informed. (4) They make accusations of incompetence or even fraud. Remedy: demand evidence. Point out that their argument is problematic if it leaves no room for people making honest mistakes.
Look for different agendas. In the paranormal area, skeptics tend to focus on whether X is true, like gravity, so its truth is the central issue. But believers tend to focus on whether X is meaningful or beneficial in some way, like Santa Claus, so its truth may be of little consequence. Beware the difference.
Learn about human judgement. Human judgement processes are an important area of psychological research. By 1970 there were more than 400 published studies. Today there are thousands, including dozens of books. Two are outstanding, and both have been pictured above. (1) Sutherland's book is very readable and nontechnical, with many references and examples every inch of the way. (2) Blackmore's book is the next best thing to doing your own research: very readable, nontechnical, and hard to put down. Both provide a rich resource for undeceiving ourselves. For others see Magazines and Books under Classroom Resources.
Now for the bad news
In short, most people have no idea what sound evidence is. They have trouble even with basic reasoning, and are unable to provide sound evidence for their firmly held opinions. The same is true of most applicants for James Randi's million dollar prize (see Million Dollar Prize on this website under Weird Things meet Critical Thinking > Undeceiving Ourselves). So, despite our best intentions, mere reading may leave us little better off. Set us before an enticing strange belief and our smug rules are gone in a flash. Fortunately this is not the end of it, for what matters is not so much rules as practice, motivation, feedback, and being cautious.
The key is practice, practice, practice
But strange beliefs are typically fuzzy and full of unclarity. So how can we achieve the required practice, motivation, feedback, and caution? The answer is to keep trying! Read the articles on this website to see how undeceiving yourself works, line up your favourite strange beliefs, and practice applying the questions given in Crash Course in Critical Thinking under Classroom Resources. If you have a taste for foreign travel you can also visit the skeptic websites given in Links. For secondary students the WA Skeptics Awards will give you excellent hands-on practice -- and the lessons you learn by practising on strange beliefs will help you later in all areas of adult life. Finally, all of us can be encouraged by the increasing availability of skeptic works, both in print and on the Internet. Undeceiving ourselves was never meant to be easy, but it has never been easier than now.