Heuristics and Biases
Full Title: Heuristics and Biases: The Psychology of Intuitive Judgment
Author / Editor: Thomas Gilovich, Dale Griffin and Daniel Kahneman (editors)
Publisher: Cambridge University Press, 2002
Review © Metapsychology Vol. 6, No. 41
Reviewer: Max Hocutt, Ph.D.
This will be a perfunctory review of a massive (857-page) work, for I lack the technical expertise to do a detailed
analysis of its contents, which are themselves too detailed and technical to
summarize.
Heuristics
and Biases is a collection of 42 reports, most by several authors, on a
program of research in cognitive psychology.
Started by Amos Tversky and Daniel Kahneman over three decades ago, this
research aims to identify the causes of what are said to be systematic errors
in reasoning. Inspired by observing the
fallacies committed by students in their courses in experimental design,
Tversky and Kahneman developed the hypothesis that much reasoning about
probability is guided not by formal algorithms but by “heuristics and biases”
that are more natural and intuitive.
Three simplifying heuristics—availability, representativeness, and
anchoring and adjustment—were identified at first. In the intervening years, these three have been redefined,
reconsidered, and in some cases renamed, while other heuristics—attribute
substitution, outrage heuristic (I’m not making this up!), recognition heuristic, prototype heuristic,
etc.—have become candidates for addition to a growing list.
According to the editors, the appeal of this kind of
research derived from the hope that it would enable psychologists to replace “the
classical model of rational choice” with a more accurate picture of human
reasoning under uncertainty. (p. 1) No longer would it be necessary to accept
the unrealistic idea that human beings are error-free calculating machines that
follow rigidly defined algorithms specified a
priori. (Whether anybody ever
actually believed this is not discussed.)
This rationalist “model” could still be taken as a description of an
ideal and serve as a norm for human thought, but it could no longer be taken as
a description of actual human reasoning. Empirical research would now take a
new direction—showing that human beings are antecedently disposed to make
certain kinds of logical mistakes, not
randomly but systematically and predictably, according to determinate causes. A
whole new field of investigation lay open.
Contributions to the betterment of mankind—and careers—were to be made
tilling its soil.
The error that first got Tversky and
Kahneman (henceforth, T&K) going on this program was the so-called
conjunction fallacy. According to
elementary probability theory, the probability of p&q can never exceed either the
probability of p or that of q. Yet
T&K found that students given a choice between (a) “Jones had a heart
attack” and (b) “Jones had a heart attack and was over fifty-five years old”
tended to assign a higher probability to the second statement. (p. 36)
To explain why their students had committed this error, T&K
theorized that they had not used the law of probability which they had been
taught. Instead, they had used what
T&K called a representativeness heuristic, meaning that they had picked out
the more typical, not the more probable, case.
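To make the rule concrete, here is a minimal sketch in Python, written for this review rather than taken from the book; the variable names and the figures for Jones are made up and serve only to show why the conjunction can never be the more probable option.

```python
# A minimal sketch of the conjunction rule, using assumed, purely illustrative numbers.
p_heart_attack = 0.10            # assumed P(Jones had a heart attack)
p_over_55_given_attack = 0.60    # assumed P(Jones is over 55, given a heart attack)

# The conjunction scales the single event down by a conditional probability,
# which is at most 1, so it can never exceed the probability of the single event.
p_attack_and_over_55 = p_heart_attack * p_over_55_given_attack   # roughly 0.06

assert p_attack_and_over_55 <= p_heart_attack
print(f"P(attack) = {p_heart_attack:.2f}, P(attack and over 55) = {p_attack_and_over_55:.2f}")
```

Whatever figures one plugs in, the conjunction comes out no higher; that is all the elementary theory claims.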
Subsequent
discussion has made it clear that this interpretation is not proved by the
example. One possibility that does not seem to have been considered by T&K
at the time is that their students misunderstood their question. Well-constructed multiple-choice exams do not
usually offer overlapping choices of the form “p or p&q.” Instead, they offer mutually exclusive
choices of the form “p&-q or p&q.” Given that this is so, T&K’s
students may have thought that they were being asked to choose between (a)’
“Jones had a heart attack [and is not over 55 years old]” and (b) “Jones had a
heart attack and is over 55 years old.”
In that case, however, their error was not an error in logic but a
mistake in interpretation—one invited by a misleading question. So, grant that a mistake was made. Was it a mistake of reasoning, or of perception? By suggesting that it was a mistake of
reasoning that is in many ways analogous to a mistake in observation, Kahneman
does not clarify this issue; he further obscures it.
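On the exclusive reading, the same made-up figures show why preferring (b) need not be illogical at all; the sketch below is again illustrative only, with assumed numbers, not anything argued in the book itself.

```python
# Sketch of the exclusive reading, with the same assumed, illustrative figures.
# On this reading the two options partition the heart-attack cases:
#   (a)' heart attack and NOT over 55
#   (b)  heart attack and over 55
p_heart_attack = 0.10
p_over_55_given_attack = 0.60    # assumed: most heart-attack victims are over 55

p_b = p_heart_attack * p_over_55_given_attack               # attack and over 55
p_a_prime = p_heart_attack * (1 - p_over_55_given_attack)   # attack and not over 55

# Ranking (b) above (a)' breaks no law of probability; it is just the judgment
# that the older subgroup of heart-attack victims is the larger one.
print(f"P(a') = {p_a_prime:.2f}, P(b) = {p_b:.2f}")   # P(a') = 0.04, P(b) = 0.06
```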
On reconsideration of this and similar examples, T&K
acknowledge that their questions may have biased their results; but they are
not apologetic. Because no one hypothesis can explain all their examples, they
continue to believe they were on to something.
Let the fun continue. Besides,
the cat was now out of the bag. (p.
81) Literally hundreds of psychologists
were already pursuing the exciting idea that other people [besides themselves,
of course] are illogical, if not positively irrational. It is true that, in the course of carrying
out their investigations, these psychologists eventually changed their
tune. No longer determined to document
the thesis that most of us are idiots, they now favor a different
hypothesis—viz., that we human beings have what Daniel Gilbert calls a dual
system of reasoning. Faced with the
need to estimate probabilities, we first make a rough-and-ready judgment
following intuitive heuristics; then, if we have time and incentive, we correct
our initial judgment with more careful and accurate calculations.
One wonders why this common-sense observation
required confirmation by an elaborate program of experiments, but that has
always been the way of psychology.
Besides, as the editors emphasize, this theory fits with the belief that
human perception and reasoning evolved under circumstances that required quick
evaluation of threats and opportunities. (You had to size up the thing as a
tiger before it could eat you!) It
also conveniently comports with the modular theory of the mind advanced by such theorists
as Jerry Fodor, according to which different cognitive capacities are resident
in different parts of the brain. The
“dual system” theory further suggests a way to combine the connectionist belief
that at least part of the brain is designed for pattern recognition with the
competing belief that at least part of the brain is designed for rule-based
calculations. In short, the “dual
system” theory is consilient with leading developments in evolutionary
psychology, artificial intelligence, and cognitive psychology. The trouble is that the dust has not yet
settled from efforts to test this theory.
As I suggested earlier, there is still room to dispute not just various
interpretations of the data, but the data themselves.
In the meanwhile, the work goes on, and if the
reader wishes to have a single volume containing the most interesting results
of that work to date, he could hardly do better than choose this one. As the back cover proclaims, “This book
compiles the most influential research in the heuristics and biases tradition
since the initial collection of 1982 (by Kahneman, Slovic, and Tversky). The various contributions develop and
critically analyze the initial work on heuristics and biases, supplement these
initial statements with emerging theory and empirical findings, and extend the reach
of the framework to new real-world applications.” I would add that the various essays are generally well and
clearly written considering the complexity of the issues. So the book should serve well as a reference
work for researchers in cognitive science and as a textbook for advanced
courses in that difficult topic.
Philosophers interested in cognitive science will also wish to consult
it.
One small complaint: Desiring, no doubt, to offer as much material as possible in a
compact package while minimizing its costs, the publishers have chosen to
reduce the size of the type and to use glossy paper. After a time, the combination of the two produces eye strain and
headaches in old heads like mine. But,
of course, nobody will want to read such a book from cover to cover. Like other such volumes, this one is best
consumed a very small bit at a time.
© 2002 Max Hocutt
Max Hocutt, Ph.D., Professor of Philosophy Emeritus,
The University of Alabama
Categories: Philosophical