Postgrad Med J 88:301-302 doi:10.1136/postgradmedj-2012-130978
  • On reflection

Being wrong

  • John Launer
  • Correspondence to Dr John Launer, London Deanery, Stewart House, London WC1B 5DN, UK; jlauner@londondeanery.ac.uk
  • Contributors John Launer.

I was talking yesterday with a friend from Nice in the south of France. Nice is one of my favourite holiday destinations and so I told her the places there that I liked the most—including the Chagall Museum and a picturesque part of town called the Cours Saleya. I said that I especially loved a pretty little enclosed market called ‘Le marché des enfants rouges’. I described how we had eaten a delicious meal at one of the open air restaurants there. The friend was puzzled: she had lived in Nice for many years but never been there or even heard of it. At this point my wife intervened in the conversation and said she wasn't sure the market was actually in Nice. She thought we had probably come across it somewhere else in France. Somewhat irritably, I went to fetch our Blue Guide to France, to prove I was right. Alas, I was quite wrong. The market is in Paris. It will come as no surprise if I tell you that, until I opened the guide book, I would have sworn blind it was in Nice. I would even have described in detail how I walked there from the seafront and past the Picasso Museum—which would be very odd since Paris isn't on the sea and Nice doesn't have a Picasso Museum.

In a fascinating book called Being Wrong, the US writer Kathryn Schulz examines incidents like this, and very many more.1 She describes the myriad ways in which human beings can get things wrong. She offers an encyclopaedic account of everything from mirages and optical illusions to false memories and profound beliefs in the absurd. Schulz demonstrates how almost everything we do, say and believe as human beings is accompanied by certainty, even though we generally depend for our knowledge on highly selective physical senses, insufficient evidence, inaccurate memories, or the beliefs of those around us. Her book is punctuated with some wonderful quotations on wrongness. ‘It infuriates me to be wrong’, wrote the playwright Molière, ‘when I know I'm right’. The philosopher Ludwig Wittgenstein put it more subtly: ‘“I know” seems to describe a state of affairs which guarantees what is known, guarantees it as a fact. One always forgets the expression, “I thought I knew”’.

The central argument of Schulz's book is that being wrong is innate to the human condition. Indeed, she points out that wrongness is virtually unique to our species, an intrinsic part of our capacity for reason and creativity. She reminds us that all science depends on falsification, and on ‘the permanent possibility of someone having a better idea’. She points out the connection between error and humour, which often depends on mistakes or absurdity. She also makes the link with art and literature, themselves a form of controlled illusion. Above all, she argues that we should accept our capacity to make mistakes, and celebrate it as one of our most valuable attributes.

Leading scientists

The case made by Kathryn Schulz is supported by two books about error that have been published by leading scientists in the past year. One is by the Nobel laureate Daniel Kahneman, and the other by Robert Trivers, who is among the world's leading evolutionary theorists. They bring two quite different lenses to the problem of wrongness. Kahneman—a psychologist who specialises in decision-making and uncertainty—examines how we intuitively reach conclusions that are entirely at odds with true statistical probabilities. Trivers addresses the fundamental question of where our capability for error actually comes from. His view is that it is hard-wired into the human brain, as a consequence of natural and sexual selection. Neither writer has any illusions about our propensity to be mistaken.

Daniel Kahneman carried out his original research in the 1970s and 1980s with his colleague Amos Tversky. They conducted an extraordinary series of experiments to show how human judgement is affected by biases and ‘heuristics’: unconscious rules of thumb that serve us well in many situations but can also trick us into certainties that are demonstrably and sometimes dangerously false. In Thinking, Fast and Slow, Kahneman describes decision-making in terms of two more or less separate forms of human mental activity.2 System 1 ‘operates automatically and quickly, with little or no effort and no sense of voluntary control.’ System 2 involves the application of concentrated effort to complex computations, and is entirely conscious. Very often, System 1 lets us down by jumping to conclusions. It is too easily distracted by the superficial features of a problem, when it ought to stand aside and let System 2 do the hard work needed for an informed solution.

Many doctors will be familiar with some of the ideas that Kahneman and Tversky first spelled out. These include the ‘availability heuristic’, which leads us, for example, to make the first diagnosis that comes to mind and fits a patient's symptoms. Often, it prevents us from doing anything further to explore other hypotheses, even though several diagnoses might actually fit the same clinical picture. Among other forms of bias relevant to medicine are the ways that we consider risks. Offering information to patients in one way (‘a 90% chance of survival’) will evoke a very different response from presenting it in another form (‘a one in ten chance that you will die tomorrow on the operating table’). In his book, Kahneman presents a huge range of examples to convince readers that our minds are brilliant instruments of persuasion but lousy statisticians. This fact can lead to errors costing thousands of lives and billions of pounds.

One of the most compelling stories in Kahneman's book concerns research into schools in the USA. The research showed convincingly that the most successful schools, on average, were small. On the basis of this finding, the Gates Foundation poured US$1.7 billion into a programme to divide large schools into smaller ones, and several other major funders then supported the programme. It was only later that statisticians pointed out that the worst schools in the USA, on average, were also small. Both findings are the consequence of a simple statistical artefact: any sample involving a small number of subjects (in this case, the number of children per school) will produce more extreme averages in either direction than a larger one. In fact, taking the US school population as a whole, children at big schools attain better results.
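The artefact is easy to demonstrate. The sketch below (my illustration, not drawn from Kahneman's book) simulates schools whose pupils' scores all come from the same distribution; the smallest schools nonetheless crowd both the top and the bottom of the resulting league table, purely because small samples produce more extreme averages.

```python
# Minimal simulation of the small-school artefact (illustrative only).
# Every pupil's score is drawn from the same N(500, 100) distribution,
# so no school is genuinely better or worse than any other.
import random

random.seed(42)

def school_mean(n_pupils):
    """Average score of one school with n_pupils pupils."""
    return sum(random.gauss(500, 100) for _ in range(n_pupils)) / n_pupils

# Half the schools are small (50 pupils), half are large (2000 pupils).
schools = (
    [("small", school_mean(50)) for _ in range(500)]
    + [("large", school_mean(2000)) for _ in range(500)]
)

league_table = sorted(schools, key=lambda s: s[1], reverse=True)
top = [size for size, _ in league_table[:100]]
bottom = [size for size, _ in league_table[-100:]]

print("small schools among the top 100:   ", top.count("small"))
print("small schools among the bottom 100:", bottom.count("small"))
# Both counts come out close to 100: small samples scatter further from
# the true mean, so small schools dominate both extremes of the table.
```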

Self-deception

Like Kathryn Schulz, Kahneman asks how we get things wrong so consistently, but neither writer addresses the underlying question of why we do so. This is exactly what Robert Trivers attempts to do in his new book Deceit and Self-deception.3 Trivers's reputation rests mainly on his research into altruism, and on his theory explaining generosity within the family in terms of promoting self-interest by indirect means. His latest work covers a darker side of evolution: how we also promote our interests by ignoring or suppressing knowledge that might deflect us from our purposes. The book carries the self-explanatory subtitle: ‘Fooling yourself the better to fool others’. At the core of Trivers's argument is the idea that deception is an essential tool—perhaps the key one—in the struggle for survival and reproduction. Self-deception is simply a way of deceiving people more efficiently: if we conceal information from ourselves as well as from others, it will improve our performance and sense of control, and reduce the cognitive effort involved.

Trivers identifies several distinct categories of self-deception including self-inflation, distinctions between in-groups and out-groups, moral superiority and false personal narratives. His book covers everything from the neurophysiology and immunology of self-deception to its manifestations in fields as wide as sex, the family, everyday life, aviation and space disasters, nationalism and war. Some of his most detailed and poignant accounts of the errors arising from self-deception include his close analysis of how managers at NASA ignored or suppressed information that could have prevented the ‘Challenger’ and ‘Columbia’ disasters. He also looks at how governments systematically conceal or deny realities, such as the Turkish genocide of the Armenians, or the absence of any information to justify claims by the Americans and British that Saddam Hussein had weapons of mass destruction.

Engaging with error

Although Deceit and Self-deception is at times depressing, it has an upbeat ending, with the last chapter entitled ‘Fighting self-deception in our own lives’. Indeed, a theme shared by all three books is that it is perfectly possible to become aware of our mistakes, and to turn them into a source of self-knowledge and even pleasure. In professional contexts, including medicine, noticing our errors is of course an important route to patient safety and the continuous improvement of quality. Rather than being in denial about errors, getting furious about them, or attempting to cover them up, we should welcome each one as possibly the best and most honest feedback we can ever receive about our own performance.

In the course of today, you will systematically fool yourself in very many ways. You will miss the point in many conversations, assert facts that are untrue, take offence when none is meant, see things that aren't there, recollect events that never happened, make uninformed decisions, and assert strong opinions for which you have little or no evidence. You will also notice others regularly doing all these things, while you spend most of the day possessed by the confident certainty that you alone are not. If Schulz, Kahneman and Trivers are right, you will behave like this simply because you are human. But according to all three of them you also have a choice: to accept what is going on, laugh at it, and learn from it. As the old adage has it: ‘I have learned so much from my mistakes, I intend to make many more’.

Footnotes

  • Competing interests None.

  • Provenance and peer review Commissioned; internally peer reviewed.

References

  1. Schulz K. Being Wrong: Adventures in the Margin of Error. New York: Ecco, 2010.
  2. Kahneman D. Thinking, Fast and Slow. London: Allen Lane, 2011.
  3. Trivers R. Deceit and Self-Deception: Fooling Yourself the Better to Fool Others. London: Allen Lane, 2011.