BIAS
You are probably familiar with the idea of cognitive bias: a trick of the mind that stops you seeing what’s in front of you or thinking clearly, and a recognised cause of diagnostic error.
There is a whole menagerie of biases with mundane or more exotic names such as base-rate neglect, the gambler’s fallacy, and the Pollyanna principle,1,2 and wandering through this psychological zoo for any length of time may make us question how much we ever get right. Of course, if you’re prone to exceptionalism, you might just smile and shake your head at how easily other people get in a muddle.
WE HAVE TWO WAYS OF THINKING
The idea of cognitive bias is connected with the idea that we have two complementary ways of thinking.3,4 What we usually understand as thinking — consciously processing information step-by-step in a way that we can readily explain — turns out to be quite slow. It can only handle small sets of data, and only if they’re readily available, limiting its usefulness.
The other kind of thinking is far more powerful, integrating huge datasets from multiple sources below our level of conscious awareness, including memory and subtle environmental cues. It is so fast and so distinct from its slower cousin that we don’t even consider it thought, referring to it instead as intuition, gut feeling, or acting-in-the-moment. Its weakness, though, is that it is prone to cognitive bias. To some extent, of course, we can recognise this and make allowances. The tendency to bias in our fast thinking, however, is so fundamental that it demonstrates a much bigger truth.
BLIND SPOTS
Fast thinking relies on heuristics: cognitive short-cuts and rules of thumb that work by association and extrapolation to bridge our mental gaps and fill in the blanks. We rely on these heuristics not just in our thinking, but also in our day-to-day experience of reality. We all have a literal blind spot in our visual field of which we’re so unaware that identifying it feels like a party trick.5
Likewise, our perception of colour is limited to central vision, yet there is no point beyond which we suddenly experience our surroundings in black-and-white. Inattentional blindness prevents us from seeing something unexpected even when it stands in front of us and waves.6
In each of these cases, we are integrating incoming sensory data with extensive background knowledge of how things usually work to make sense of the world around us. We do this so smoothly that we don’t even realise: we think we’re simply observing reality, when in fact what we see depends at least partly on what our brain tells us to see.
Cognitive bias affects our perception and thinking in a similar way during the consultation. It’s not just that we commit errors: our entire mental operating system is built in a way that makes them inevitable.
HUMANS ARE FALLIBLE
Can we ever hope to see things clearly, then, or are we doomed to founder in a sea of subjectivity and misdiagnosis? We all fall off the tightrope now and then, and a safety net is a sensible precaution, no matter how confident we are in our skills. More than this, though, being open about our propensity to error with ourselves, our patients, and our colleagues makes it easier to learn from experience.
Every diagnostic ‘failure’ becomes another piece of evidence to inform and refine our thinking next time. Patients tend to value doctors who are fallible but willing to help them through life’s difficulties by embracing their own part in the drama, as one person with another, rather than pursuing some false vision of objectivity.
We will never be free from bias or error. Our security lies not in perfection, but in recognising and learning from our imperfections.
© British Journal of General Practice 2022