From       (3700 words 6 graphics)       Home       Fast-Find Index

Undeceiving Ourselves
How our thinking goes astray
and what to do about it

By Geoffrey Dean

This article is a simplified and updated version of the article "Undeceiving Ourselves" by Geoffrey Dean, Ivan W Kelly and Arthur Mather that originally appeared in The Skeptic Encyclopedia of Pseudoscience edited by Michael Shermer, ABC-CLIO 2002, Volume 1 pages 272-277. According to the Oxford English Dictionary, undeceive is a long-established word dating from 1598 with various derivatives such as undeceivableness. Its meaning is exactly what it says -- to free from deception, to reveal the truth to someone previously misled.

"What a piece of work is man! How noble in reason! How infinite in faculties!" said Shakespeare's Hamlet. And for simple everyday living, Hamlet was right. We really are a noble piece of work, otherwise we would not be here. But when things get complicated, as in deciding the truth of a strange belief, Hamlet was dead wrong. Nobody with infinite faculties would still be arguing after 2000 years about astrology or religion or how to bring up children. In fact, "How poor in reason! How limited in faculties!" would be nearer the truth.

Blame it on evolution
Blame it on evolution. We need more time to adapt to modern living. If human existence since the first use of tools two million years ago is represented by the height of a table, our entire recorded history is no thicker than a 20-cent coin, and the three centuries since the Industrial Revolution are as thick as a postage stamp. We humans are designed for a world that no longer exists, one where our survival depended less on reason and more on blind reaction. A movement in the tall grass might be a tiger or the wind, but running was safer than reasoning. A man seeking the truth by reason did not live long.

There were other consequences. Children became programmed to learn quickly whatever they were told. They had to learn -- and learn fast -- that fire burns and dogs bite, or they would not survive. Adults became programmed to act on what seemed like a connection or a pattern even if none actually existed. If shouting or praying during an eclipse was followed by the reappearance of sunlight, then people learned to shout or pray whenever there was an eclipse. The pattern was false but it seemed to work.

Making sense out of patterns
Progress became dependent on pattern recognition, the ability to make sense out of a blur of objects and people and situations. Without pattern recognition we would drift forever in a sea of possibilities so vast that the idea fire causes burns would be no more likely than the idea rain causes hunger. Indeed, pattern recognition became so important that we were driven to seek patterns even when none existed, as in seeing shapes among the stars or in the entrails of goats.

[Scouts cartoon]

Then came language, which allowed the patterns (true or false) to survive and thus influence future society. Today, pattern recognition lets us tell one face from thousands of others, and is why supermarkets advertise their specials in pictures rather than words. It seems so simple and easy. But underneath it all we are trying to cope with modern living by using a brain designed for survival in a quite different world. Consider what this means for beliefs in general.

Deceiving Ourselves
The programming by evolution that helps children survive also helps them accept fantasies such as Santa Claus. As time goes on, they avoid conflicting beliefs by becoming more selective and by asking questions. As a result, most of us end up with much the same beliefs as our parents and the community. What determines most of our central beliefs is not our gender or intelligence or personality but our upbringing. Nevertheless, when we get emotional as in anger or fear, our beliefs can easily bypass the reasoning parts of the brain. On a bad day this can leave us with beliefs that we are compelled to follow even though they make no sense, such as compulsive hand washing or a fear of open spaces.

Is truth relevant? We like to think so, but society often sees truth or falsity as less important than believing. Faith is respected, skepticism is not. Disbelievers were once burned at the stake, and religion can still lead to war. It is faith, not reason, that kills, as happened at Jonestown in 1978 when 900 people died due to faith in their religious leader. Is logic (sound reasoning) relevant? Again, we like to think so, but logic in everyday life is often unrealistic. Nobody reasons logically to decide between strawberry and vanilla ice cream. And logic is often not justified anyway, simply because most errors are of little consequence.

In short, we are programmed by evolution to believe almost anything. What matters most is not truth or logic but content. Or, as Bertrand Russell said, what men want is not knowledge but certainty. For most of us, life becomes very difficult without the certainty provided by a belief system -- any belief system. Thus, one of the few valid generalisations in social psychology is the "principle of certainty", which says when there is evidence both for and against a belief, most people show not high levels of uncertainty, which would be the most sensible reaction, but high levels of certainty either for or against, which is not at all sensible. For them, it is better to be wrong than uncertain.

Effects of complexity
As things get more complex, as in a pseudoscience like astrology, they generally become more uncertain. So we reduce uncertainty by slotting cases into simplified pigeonholes. That is, in conformity with the "principle of certainty", we opt for simple black or white rather than shades of grey. Thus most people either believe astrology works or they don't, with relatively few don't-knows -- yet how many will have actually tested astrology? When information is lacking, we still use (invented) pigeonholes to fill the gaps. We even remember via pigeonholes, via black or white, thus distorting the original -- indeed, you just had an example of this where astrology was pigeonholed as a pseudoscience.

There is actually more to astrology than being a pseudoscience. For example, studies have shown it may have merit as an ice breaker, or as a focus, for some clients receiving therapy by conversation, where its truth or falsity is of little consequence.

As a result, we tend to make judgements by assuming what we don't actually know (how many of us have actually tested our beliefs?), and by finding connections where none actually exist. So we are much less bothered by worthless data than we ought to be. All of these things are a legacy from our evolution. In fact, these things come so naturally that the problems were largely unsuspected until people tried to make a computer model of how we think. The problems arise when we want to find real connections and avoid mistakes, as when we first meet a strange belief.

To find connections where none actually exist, the only requirement is that our belief be established in advance (say, by reading about it), regardless of whether the belief is true or false. Suppose we believe that people with red hair have hot tempers. Most likely our experience of redheads is not clear-cut but rather vague, so our belief cannot fail to be confirmed -- we will see vague behavior as hot-tempered, and vaguely red hair as genuinely red. Truth or falsity will not come into it. If it seems ridiculous that your judgement could be affected by knowing the answer in advance, try making sense of this statement: "The trip was not delayed because the bottle shattered." The statement will seem vague and meaningless. But try again, this time thinking about christening a ship. The statement now seems crystal clear, and your belief that it is about a ship will seem amply confirmed.

But the statement is actually about dropping a bottle of Coke on a hiking trip. So your judgement of a vague and unclear behaviour was determined not by truth or falsity but by knowing the supposed answer in advance.

[1727 woodcut] In the 16th century your judgement was determined by the Church, as shown on the left of the woodcut. But things were changing. On the right, on the other side of the tree of knowledge, man was making his own judgements, pretending it was only to confirm Church teachings.

What if we have no prior beliefs? Here we can be led astray by another legacy from evolution, a potent learning process that occurs whenever something we do (accepting a belief, placing a bet) is followed by something else (feeling good, winning something). When the time interval is short, learning is automatic, and we can end up believing the two events are connected when they are not. Even worse, contrary to what we might expect, our belief becomes very resistant to change if the two events occur only occasionally rather than all the time. Thus occasional winning at roulette encourages further bets because we see that losing does not stop us winning, whereas fifty losses in a row persuades us to give up. Because two events can occur together just by chance, we can end up believing all kinds of things that are actually false.

[Cover of Blackmore's Adventures of a Parapsychologist] Our ability to make sense out of no sense is due to how we think. Our acceptance of false beliefs, and of any supposed psychic experiences, is the result of faulty thinking. The experiences are real enough, but their origin lies internally in our thinking, not in some real property of the observable world. Even the parapsychologist Dr Susan Blackmore (a former True Believer) did not accept this explanation at first. But after twenty years of testing her beliefs one by one, she changed her mind. Her book (first published in 1986 by Prometheus Books and since revised) is still in print.

Judging numbers
As it happens, we are quite good at things that require only counting. As marbles are drawn from a bag at random, we can estimate their average size or the proportion of red quite well. But once we start looking for links, such as between size and color, our ability disappears. For example, nurses were given the following data for a symptom and a disease:

                      Disease
                   Yes      No
  Symptom  Yes      37      17
           No       33      13
  Total             70      30

Thus 37 diseased patients showed the symptom and 33 did not.

Are symptom and disease related in these data? As shown below, the correct answer is no. But 80% of the nurses said yes, 7% said no, and the rest gave up. When asked to explain how they got their wrong answers, the majority of nurses said the most common combination was yes/yes, therefore disease and symptom were related. They had ignored the other combinations, which show the opposite -- the symptom is slightly more common among those with no disease (17/30 = 0.57 versus 37/70 = 0.53). Similarly, if asked whether redheads are hot-tempered or Leos are generous, hardly anyone considers even-tempered brunettes or tight-fisted non-Leos. Yet no link can exist unless redheads differ in temper from brunettes, and Leos differ in generosity from non-Leos. In short, no conclusions are possible without data for all four combinations (yes/yes, yes/no, no/yes, no/no). So be suspicious when believers consider only yes/yes combinations, as they usually do. For example they consider only predictions that come true.
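The check the nurses skipped can be done in a few lines. The sketch below (variable names are mine, not from the article) computes the symptom rate separately for diseased and non-diseased patients, which is the comparison that actually decides whether a link exists:

```python
# 2x2 contingency table from the nurses' exercise:
#                 disease: yes   disease: no
# symptom: yes        37             17
# symptom: no         33             13

symptom_and_disease = 37
symptom_no_disease = 17
no_symptom_disease = 33
no_symptom_no_disease = 13

# Rate of the symptom among diseased vs non-diseased patients.
rate_diseased = symptom_and_disease / (symptom_and_disease + no_symptom_disease)   # 37/70
rate_healthy = symptom_no_disease / (symptom_no_disease + no_symptom_no_disease)   # 17/30

print(f"symptom rate given disease:    {rate_diseased:.2f}")  # 0.53
print(f"symptom rate given no disease: {rate_healthy:.2f}")   # 0.57
# The two rates are almost equal (the symptom is in fact slightly MORE
# common among the healthy), so symptom and disease are unrelated.
```

The point is that the large yes/yes count of 37 means nothing by itself; only the comparison across all four cells can reveal a link.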

It gets worse. Once we move from data on paper to data drawn from memory, we become subject to further judgement errors such as the following, largely because memory is a process of reconstruction rather than retrieval:

Vividness -- we focus on vivid things, ignore dull things
Representativeness -- we focus on similarity, ignore actual occurrence
Stereotypes -- we use simplistic ideas, ignore actual observations
Sample size -- we ignore the huge sampling uncertainty of tiny samples
Overconfidence -- we tend to be overconfident in our judgements
Overload -- we cannot juggle more than about 7 chunks of data at once

[Cover of Sutherland's Irrationality] There are many other causes of judgement error. For example, in his book Irrationality: The Enemy Within (Constable 1992), the psychologist Dr Stuart Sutherland identifies no fewer than 100 causes of judgement error. He notes that our judgement is not helped by the tedious effort required to avoid error as compared, say, to the notable lack of effort required to recognise faces -- a result of our visual system having developed first. We can easily become blind to what really matters, so our judgement can be wrong in ways we never suspect. The demands of modern ideas have outrun our brains and minds. For examples, just read the articles on this website. We need help.

Now for the good news. We already have in place countermeasures against deceiving ourselves. They did not come quickly or easily, but they have been enormously successful. They are known as science. Or, as the Nobel prize-winning physicist Richard Feynman said, "science is what we have learned about not fooling ourselves". Of course not everyone can be a scientist, but everyone can benefit from the insights of science, which lead to the following strategies (in no special order).

Undeceiving Ourselves
Learn to change. The problem is simple: we are generally unaware of our errors, and we are generally overconfident about our judgements, so it will seem implausible that our reasoning could be faulty. This is especially so if we believe in a pseudoscience, because no pseudoscience can tolerate genuine science and error-free reasoning. For example, no sun sign astrology book could tolerate a listing of the many scientific studies that have failed to support sun sign claims. We therefore have little incentive to change. But change we must.

Avoid emotional involvement. Remember that judgement errors are pervasive even though most people are unaware of them. Unless a claim is supported by a tally of confirming and disconfirming cases, you can safely assume that judgement errors are alive and well. Consider emotional involvement. Hell hath no fury like a cherished belief under attack. Which is more desirable -- feeling secure or being right? How much would it matter if your belief was wrong?

Ask questions. The aim is not to win but to learn. Ask believers in a strange belief the following questions: Why do you believe in it? This puts the burden of proof where it belongs -- on the claimant. What evidence would you accept as posing a problem for your belief? This is a potent question because it opposes the tendency to consider only favourable cases. Are there other explanations that could produce the same effect? This too is a potent question. Where did your idea come from? A credible source means the idea may be plausible even if the previous answers are unsatisfactory. Why should we believe in it? This restates the previous questions from our own viewpoint. For variants on these questions see Crash Course in Critical Thinking on this website under Classroom Resources.

Think about other explanations. Try to provide a plausible rival hypothesis. For example the Draw-a-Person personality test has been largely abandoned because the hypothesis "unusual person = inner conflicts" was displaced by the more plausible hypothesis "unusual person = lack of artistic ability." If you cannot think of a rival hypothesis, consider what might be the more plausible: X is true, or X is due to human judgement errors.

Should you keep an open mind? If X is possible but you have no evidence for or against, should you keep an open mind? The question is deceptive because the word possible is ambiguous. It can mean barely possible (if you jump off a cliff, it is possible that a freak wind will save you), or it can mean seriously possible (if you jump off a cliff, it is possible that you will die). Bare possibilities are vastly more numerous than could ever be studied, so only serious possibilities deserve an open mind. But an open mind requires us to tolerate uncertainty, which most of us find extremely difficult. Is the believer really open-minded? Be aware that believers use open-mindedness to frustrate criticism -- to them anything goes, which to them confirms their open minds, whereas requests for evidence indicates a closed mind. But their call for open minds is no more than a call to abandon all criticism. In effect it provides the glue without which their belief might fall apart.

[Psychiatrist cartoon] Know the difference between beliefs and facts. Beliefs are just statements of opinion. You are free to agree or disagree. But when something is observed again and again under error-free conditions, it is a fact. Facts are not beliefs. You cannot simply dismiss them. To do that you have to fault the way the observations were made. To be good at this requires long training to avoid errors of thinking, perhaps as long as it takes to learn a second language.

For data snoopers. If you snoop around in data looking for something interesting, then your judgement errors are bad news. Try the following remedies: Graph the results so you can see what is happening. Test the findings from half of the sample on the other half. Compare your results with those of similar or nearly similar studies. Replicate on fresh data or, if fresh data is unavailable, on random data.
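The split-half remedy above can be sketched in a few lines of Python. This is a toy illustration under my own assumptions (the function names and the made-up data are mine): a genuine link should show up in both random halves of the sample, whereas a pattern found by snooping usually survives in only one.

```python
import random

def correlation(pairs):
    """Pearson correlation of a list of (x, y) pairs (standard library only)."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    syy = sum((y - my) ** 2 for _, y in pairs)
    return sxy / (sxx * syy) ** 0.5

def split_half_check(pairs, seed=0):
    """Shuffle the observations, split them in half, and compute the
    correlation in each half. A real link survives in both halves."""
    rng = random.Random(seed)
    data = pairs[:]
    rng.shuffle(data)
    mid = len(data) // 2
    return correlation(data[:mid]), correlation(data[mid:])

rng = random.Random(1)
linked = [(x, 2 * x + rng.random()) for x in range(40)]    # genuine link
noise = [(rng.random(), rng.random()) for _ in range(40)]  # no link at all

print(split_half_check(linked))  # both halves give r close to 1
print(split_half_check(noise))   # no stable pattern: the two r values need not agree
```

The same split-half idea applies to any snooped finding, not just correlations: whatever statistic looked interesting in one half must be recomputed on the other half before it deserves belief.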

Be skeptical of skeptics. Unfortunately, some skeptics are as intolerant of contrary views as any committed believer. Their ploys tend to be as follows: (1) They keep raising the standards of evidence, or they find trivial flaws and claim they are fatal. Remedy: have them set the standards. (2) They deny the case simply because it is impossible or unlikely. Remedy: have them give reasons. (3) They make false claims such as "there are no cases of X", when in fact there are many such cases. Remedy: be informed. (4) They make accusations of incompetence or even fraud. Remedy: demand evidence. Point out that their argument is problematic if it leaves no room for people making honest mistakes.

Look for different agendas. In the paranormal area, skeptics tend to focus on whether X is true, like gravity, so its truth is the central issue. But believers tend to focus on whether X is meaningful or beneficial in some way, like Santa Claus, so its truth may be of little consequence. Beware the difference.

Learn about human judgement. Human judgement processes are an important area of psychological research. By 1970 there were more than 400 published studies. Today there are thousands, including dozens of books. Two are outstanding, and both have been pictured above. (1) Sutherland's book is very readable and nontechnical, with many references and examples every inch of the way. (2) Blackmore's book is the next best thing to doing your own research -- very readable, nontechnical, hard to put down. Both provide a rich resource for undeceiving ourselves. For others see Magazines and Books under Classroom Resources.

Now for the bad news
Will reading books, articles, and this website about undeceiving ourselves improve our judgement skills? We might hope so, but research has shown that the improvement is small. It is easy to see why -- we can no more improve our judgement skills merely by reading about them than we can improve our tennis or driving skills. Our believe-anything-if-it-feels-good legacy from evolution is just too much to overcome by reading alone.

[Cover of Kuhn's The Skills of Argument] In her book The Skills of Argument (Cambridge 1991), Professor Deanna Kuhn interviewed 160 people from different backgrounds to see what evidence they could give for their opinions on everyday matters such as "what causes unemployment". The results were quite unexpected. Most of the people gave what they thought was good evidence, but it wasn't evidence at all, let alone good evidence. They could cope with simple things like "without a ticket you cannot travel by bus or train", but when asked why some children stay away from school they had only opinions ("because they dislike school"), not evidence (such as the results of interviewing 100 truants).

In short, most people have no idea what sound evidence is. They have trouble even with basic reasoning, and are unable to provide sound evidence for their firmly held opinions. The same is true of most applicants for James Randi's million dollar prize, see Million Dollar Prize on this website under Weird Things meet Critical Thinking > Undeceiving Ourselves. So, despite our best intentions, mere reading may leave us little better off. Set us before an enticing strange belief and our smug rules are gone in a flash. Fortunately this is not the end of it, for what matters is not so much rules as practice, motivation, feedback, and being cautious.

The key is practice, practice, practice
We learn motor skills such as swimming and driving by practice and by learning from our mistakes. Swallowing water or hitting the curb gives us instant feedback on what to avoid. Undeceiving ourselves is basically the same process. Instead of moving our arms, we now have to move ideas, but the crucial component is no different -- we learn by practice and by making mistakes, as when a doctor discovers that a supposed stomach cancer is actually an ulcer. So to succeed in undeceiving ourselves we need constant practice, clear ideas to focus on, and clear feedback. A fuzzy idea means fuzzy feedback, so any unclarity is bad news. Imagine trying to learn if swallowing water or hitting the curb occurred at random.

But strange beliefs are typically fuzzy and full of unclarity. So how can we achieve the required practice, motivation, feedback, and caution? The answer is to keep trying! Read the articles on this website to see how undeceiving yourself works, line up your favourite strange beliefs, and practice applying the questions given in Crash Course in Critical Thinking under Classroom Resources. If you have a taste for foreign travel you can also visit the skeptic websites given in Links. For secondary students the WA Skeptics Awards will give you excellent hands-on practice -- and the lessons you learn by practising on strange beliefs will help you later in all areas of adult life. Finally, all of us can be encouraged by the increasing availability of skeptic works, both in print and on the Internet. Undeceiving ourselves was never meant to be easy, but it has never been easier than now.
