Mistakes Were Made (But Not by Me)
Full Title: Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
Author / Editor: Carol Tavris and Elliot Aronson
Publisher: Harcourt, 2007
Review © Metapsychology Vol. 11, No. 31
Reviewer: Daniele Procida
There is a vast body of literature on how to do well, how to be happy, what to do and choose for one's own benefit and that of others. It ranges from the vulgar to the work of the great moral philosophers. We are not short of such analyses or guidance.
In contrast, the body of work which considers our failure to do well and be good is decidedly smaller, and also, it must be said, rather lamer, particularly in its power to explain why we fall into foolish beliefs, make bad decisions and commit hurtful acts. We remain opaque to others and to ourselves, thinking, acting and responding in ways which are harmful, counter-productive and baffling. Most baffling of all is our propensity to continue in these patterns, to compound error with error and throw good vigorously after bad.
Attempts at explanation tend towards exasperated (and inadequate) conclusions of egoism, stupidity or evil, or reach for contentious structures of historical, social or psychological theory to provide some sort of answer. Mistakes Were Made (But Not by Me) offers an alternative to these by describing the workings of a simple process, one which by its nature is hidden from our view. This process is self-justification, and it is driven by an engine of cognitive dissonance, the discomfort we feel at the gap between our self-image and the less attractive reality that sometimes confronts us.
It works like this: I do something that I should not have done, and this troubles me, because I'm not the kind of person who does that sort of thing. Redressing the mistake will be even more painful or difficult than not committing it in the first place would have been. So, to salve this nagging complaint of the soul, I declare to myself that the act was the right one all along, and I confirm this by reinforcing it at the earliest possible opportunity.
So, although I hate it when people treat menials rudely, I fail to speak out when the boss I'm very keen to impress humiliates and bullies the waitress. But I'm not the kind of person who is too weak to stand up to injustice! A gulf yawns between self-image and reality; the dissonance is unpleasant and unsettling. Harmony will most easily be restored, the gap most painlessly closed, by retelling the story. "Heh heh," I chuckle at the boss's nasty joke, adding: "She's lucky you're not the kind of guy who'd want to get her in real trouble for that!" And the next time the unfortunate waitress approaches the table I might even try out a little sarcastic remark of my own.
Or: I'm not the kind of person who allows smooth-talking salesmen to get the better of him, but somehow I seem to have ordered 1000 leather business cards; naturally, I'm troubled by the dissonance. There's no way out of this without having to admit to myself and others that in fact I am all too susceptible to basic techniques of persuasion, so instead: the very next time someone presents me with their own card, I am overcome with scorn for the pitiful rectangle of paper I've just been handed.
This simple process, argue the authors, explains a great deal of our ability to make mistakes and continue making them. According to dissonance theory, what we should expect from someone who finds themselves in a hole are not effective efforts to climb out, but energetic digging in the wrong direction. The analysis is counter-intuitive, but that is entirely appropriate for human behavior which itself seems to defy common sense, and to resist explanation along more straightforward lines.
Carol Tavris and Elliot Aronson have made a genuinely illuminating contribution to the study of human nature, one positively brimming with intelligence and insight. It is rare, in the twenty-first century, to be presented with a complete framework of explanation built around so simple an idea. To describe it as a book of a single idea would not be an exaggeration, but it would not be a criticism either: it is a pleasure, for once, to be invited to consider such a bold and confident offering, and a concept able to sustain such explanatory weight.
The authors bring their analysis to bear on anecdotes, history, current affairs and psychological experiments. Throughout, points are illustrated with examples of foolish and harmful behavior on the scale from the personal to the social, in a variety of different contexts and across a range of human activities. Some are just amusing (for the spectator, at least), but many are alarming, some are tragic, and some are horrifying.
In all of these, there are two essential aspects of the process which bring people to do and continue doing harmful things, all the while justifying it to themselves. Firstly, there is the cognitive dissonance which so effectively drives it, leaving them in urgent need of something that will smooth the troubled waters. Then there is the spiral of self-justification, which feeds itself: under its spell they take steps which themselves require further justification.
In addition, the authors identify other mechanisms which contribute to its operation in different contexts. There is the blind spot of our own prejudices: we not only fail to recognize some important truth about the world, but (more significantly) will never, on our own, see that the blind spot even exists.
The biggest blind spot of all falls over our own integrity. It is what permitted Dr Andrew Wakefield to accept large sums of money from lawyers representing the parents of autistic children to conduct research on those children, and then to fail to disclose the fact to the Lancet when it published a paper by his team reporting a correlation between autism and childhood vaccination. The paper has been discredited, Wakefield is currently facing a General Medical Council hearing for professional misconduct and dishonest behavior, and his reputation hangs in the balance, if it is not already destroyed.
Wakefield continues to maintain his faith in the paper and his actions, resolutely denying that a conflict of interest existed. We do not need to demonize him, or even think him a liar. At some point he needed to reduce the dissonance between his self-image and his actions – how could a researcher of independence and integrity accept a large sum from lawyers? How could he not disclose this to the editors of the journal? – and justification would have set to work: Of course the money won't affect my judgments – my professional integrity will see to that! Of course I would have disclosed a conflict of interest – but obviously this wasn't one. And he would genuinely and sincerely believe this.
Our intuitions, our common sense and other people warn us to beware of evil people trying to do wrong. They are mistaken. We should be most wary of decent people, people like us, like Andrew Wakefield, who are sure they are doing right.
It is this last insight which we can most usefully take to heart. We too believe we are doing the right things, that we are justified in our actions – because we too are driven to justify to ourselves the things we do, because we too have a blind spot which hides this from us, and because we too will resist being confronted both with the blind spot and with what it hides.
Wakefield's blind spot has left him facing the prospect of disgrace. Few are so unfortunate, but our blind spots are always ready to lead us into anything from looking corrupt to looking ridiculous:
When you enter the Museum of Tolerance in Los Angeles … you watch a video on the vast variety of prejudices, designed to convince you that everyone at least has a few, and then you are invited to enter the museum proper through one of two doors: one marked PREJUDICED, the other marked UNPREJUDICED. The latter door is locked, in case anyone misses the point, but occasionally some people do. When we were visiting the museum one afternoon, we were treated to the sight of four Hasidic Jews pounding angrily on the Unprejudiced door, demanding to be let in.(41)
Another characteristic aspect of the business of justification is what the authors call the "pyramid of choice". Our first step out of dissonance is usually a tiny one, and leaves us only a short distance from where we were before. But we have started a slide, and the next step will be in the same direction, and the next, until we find ourselves at the bottom, far from where we started.
This process blurs the distinction that people like to draw between "us good guys" and "those bad guys". Often, standing at the top of the pyramid, we are faced not with a black-and-white, go/no-go decision, but with a gray choice whose consequences are shrouded. The first steps along the path are morally ambiguous, and the right decision is not always clear. We make an early, apparently inconsequential decision, and then we justify it to reduce the ambiguity of the choice. This starts a process of entrapment – action, justification, further action – that increases our intensity and commitment, and may end up taking us far from our original intention or principles.(34)
Had Wakefield refused the lawyers' money, he might now be at the bottom of his pyramid on the other side, a furious puritan of clinical ethics, disgusted and baffled by his peers' willingness to compromise their integrity. "It's the people who almost decide to live in glass houses who throw the first stones." (33)
Wakefield's story is a cautionary one, but others are stranger and more astonishing. The authors trace the work of justification in memory, beginning with mostly harmless personal anecdote and then moving through accounts of confabulists like Bruno Grosjean, whose feted memoir Fragments, an account of Holocaust survival as a young boy, turned out to be a work of pure invention. Grosjean was not in fact Jewish, nor had he ever set foot in a concentration camp. But while his story is completely false, he is not a liar, nor is he mentally ill, and he is not alone in inventing and believing – genuinely and sincerely – an extraordinary personal history.
Over a period of more than twenty years, step by little step, Grosjean's memory, "the self-justifying historian", worked to fill in the elusive, troubling gaps in his early past. Anything he came across that could be pressed into service to assuage his uneasy self-image was put to use. Fragments of the Holocaust and of survivors' accounts became his own; imagination brought them to life in irresistible, apparently veridical, undeniable detail. Every step was a self-confirming one, and each one commissioned the next.
Memory, unreliable and self-serving, gave an unhappy middle-aged man a past which solved neatly the painful riddles of his own life. It caused a publisher, scholarly associations and Jewish organizations a great deal of embarrassment. But these are trifles compared to the havoc it has wreaked in the hands of the "repressed memory" industry, and it is at this point that justification reveals itself as a truly frightening power. The authors describe how the lives of the unhappy and needy are vulnerable to exploitation, not by the unscrupulous and greedy, but by people who are convinced that they are doing right. Therapists, convinced that they can uncover hidden memories of childhood traumas, offer clients a solution to their mysterious unease, and once again, step by step, client and therapist "uncover" increasingly disturbing incidents from the past, implicating family and friends in a history of abuse. The first step, and the feelings of relief and anger that come with it, are small ones, but each leads to the next, until the bottom of the pyramid is reached, and lives are irreparably destroyed by these false and sincerely-held beliefs.
Most terrifying of all are the chapters on clinical investigation into child abuse and on criminal investigation, where a simple, so-easy-to-start process of justification persuades investigators that they are right, swiftly covering over self-doubt or skepticism. Here short-cuts and dubious methods, which not even their perpetrators could condone were they not engaged in them, are justified because they lead in the direction that justification has already taken the investigators; this is why investigations into investigations have uncovered an appalling litany of abuses of process and wrongful convictions.
Again, this is not the work of those bad/lazy/corrupt/incompetent investigators, the ones who aren't a bit like us. It is the work of those who, like us, believe they are doing the right thing; what is more, it is embedded in the system. The authors describe how official instruction in these techniques, set out in handbooks and training programs, is already infected with dissonance-reducing justification and ways to attain it. One criminal interrogation manual notes that a suspect who denies involvement will subsequently find it harder – because of dissonance – to admit that they are responsible. Interrogators are therefore advised to watch for signs that the suspect is about to make a denial, and take steps to prevent them from doing so.(143)
The culture of an endeavor is to a large extent responsible for the tendency of its practitioners to become swallowed up by the voracious appetite of justification. In those endeavors where competent, successful work is marked by the capturing of a prize – the conviction of a suspect, the uncovering of a repressed memory, the bagging of a result in some form or another – the pressure is on to capture it. Any setbacks along the way, any obstacles or difficulties, provoke discomfort and dissonance, and the urge to reduce it; the more the culture demands that prize, the greater the chance that the first step off the top of the pyramid will be in a dangerous direction.
Science cherishes its commitment to method, rather than to results. There is no scientific shame in advancing a theory which turns out to be wrong; this is part of the endeavor. The scientist, according to the demands of the discipline, subjects a posited theory to tests designed to expose its weaknesses. (Compare this with therapy, or crime detection, in which failure to attain the result does indeed represent a failure.) Even where an individual scientist, like Wakefield, falls under self-justification's spell, the culture of criticism ensures that this will sooner or later be exposed by the community.
There are no such safeguards built into crime detection or therapy. There is on the contrary an in-built horror of getting things wrong, an aversion which makes it harder for practitioners to see, never mind admit, their errors. One might expect an error-averse culture to produce fewer serious errors, but in fact this is not the case. Once again, dissonance theory predicts the unexpected outcome: the more error-averse the culture, the more likely that dissonance will push practitioners down the wrong side of the pyramid, with error compounding error, and every step making it harder to climb back up again. Medical practice, increasingly, is marked by a fear of litigation; the effect is to make admissions of error or responsibility harder to make, to make clinicians more anxious about scrutiny of their activities, and to make self-justification an automatic response to criticism. Acknowledgement of this problem is (belatedly) making its way through the profession, but it remains to be seen whether it can overcome the horror of getting things wrong which has made such a mark on medicine.
But Mistakes Were Made is not just an opportunity to indulge in there-but-for-the-grace observations about others; it serves as a warning, and a prompt to look more closely at basic assumptions and practices of our own, not just on the scale of cultures and professions, but even in our own personal lives. Taking just one example at that level, dissonance theory explains why our intuitions about catharsis are wrong, and why experiment after experiment shows that people who are given the opportunity to vent their anger will afterwards feel more, and not less, animosity towards its object. Aggression requires justification, which in turn justifies further aggression: controlling, not expressing, our anger is what will return our blood pressure and equilibrium to normal, and allow us to let go of the unhealthy ire.
This is a marvelous book on many levels, but it is not without its faults. Occasionally, one feels, the authors fail to make the important distinction between making happiness out of one's choices and trying to justify oneself into happiness. For all its sophistication, the book is sometimes unexpectedly naïve or even crude about human motivation and behavior, too inclined to speak of long-outdated theories as if they still held currency. Similarly, the chapter on memory opens lamely, appearing to be aimed at demolishing an account of memory which philosophers would have regarded as behind the times 200 years ago. The authors' account of evidence and how it works in the scientific context, likewise, appears to have been bypassed by the philosophy of science since the early 1900s. There are stylistic complaints too: occasionally – which is to say too often – the book adopts an awkward faux-conversational tone, or an injudiciously rebarbative vernacular, both of which grate.
But these are, in the context of the book's successes, quibbles. Tavris and Aronson are to be congratulated for this immensely engaging and intelligent volume, and for shedding some light on that dark side of human behavior where things start to go wrong and then get horribly worse.
© 2007 Daniele Procida
Daniele Procida teaches philosophy at Cardiff University and writes on a variety of philosophical and other topics. His most recent work is "Flying", an essay on the mythology of air travel for the exhibition Moth by Richard Powell.
Categories: General, Psychology