
Cognitive Bias is Ubiquitous in Medicine. Here’s How to Protect Against It.


Cognitive bias is ubiquitous in medicine: it has been measured in 80%-100% of practicing physicians. Such faulty cognition is associated with suboptimal clinical reasoning — research finds that cognitive bias is responsible for three-quarters of medical errors, over half of which pass unnoticed. Thankfully, a little awareness goes a long way. Below, I discuss the most common cognitive biases among clinicians and offer a method to protect against them.

The Cognitive Biases That Afflict Clinicians

Many experienced clinicians use a processing style that invites bias and allows error. After several years (about six) of practice, physicians switch from referencing biomedical information to making diagnoses with illness scripts, which link the predisposing conditions for a disorder (age, medical history, heredity, lifestyle, etc.) to the consequences of that illness (such as symptoms, their course, and effects on functioning). While novice practitioners struggle to form deductive hypotheses that they can test for accuracy, experts rely on this pattern matching. It represents a lower level of cognitive processing that, unfortunately, excludes outlying, missing, and atypical data from awareness and consideration.

Experts also progressively focus on positive feedback about their performance, which leaves little room for reflection on past mistakes they could work to correct. As a result, they perform no better than their less experienced colleagues when facing complex medical dilemmas.

In addition, we physicians tend to make diagnoses too quickly and to formulate the same ones over and over, based on our individual experience and narrowing mindsets. We then search for corroborating data to support these limited hypotheses. Too often we ignore data that does not fit, neglect to fill in missing information, and misinterpret atypical presentations. This is confirmation bias, which in clinical research we overcome by employing randomized controlled trials and systematic reviews.

As expertly trained clinicians, we tend toward overconfidence in our assessments. As a result, we draw rapid conclusions from our individual experience, often neglecting the reflection that helps avoid clinical mistakes. Too many of us create differentials only when we acknowledge uncertainty; when we are (over)confident, we see that step as unnecessary.

Further, physicians tend to over-rely on memory. We recall only our diagnoses, not the raw data on which they were based. Any recollection is actually a recreation that may be distorted by circumstance, never an exact duplicate of an original file, as on a computer's hard drive. If we do not maintain and refer to detailed documentation, we will easily distort our previous evaluations, to the point that we will even "recall" symptoms that were never present.

A Reminder of Better Clinical Reasoning

Fortunately, there are cues available to remind us of better cognitive practices that enhance outcomes for our patients. The most salient is also the most famous: the Goldwater Rule (GW). The GW is a regulation from the American Psychiatric Association (APA) that "prohibits APA member-psychiatrists from discussing the mental health of individuals without assessment or consent." It was adopted in 1973 after journalists prompted too many psychiatrists to offer judgments on Senator and presidential candidate Barry Goldwater's mental health. The rule has been revisited and upheld several times since.

The GW does several things. It helps protect the professional integrity and public image of psychiatry, keeping us from making random guesses based on sparse data and looking like partisan hacks or fools. And in the context of cognitive practices, it reminds all clinicians — psychiatrists or not — of the unconscious mental shortcuts we so often employ, shortcuts that degrade our ability to make the most accurate diagnoses and best treatment recommendations. It does so by requiring us to actually perform consensual, thorough examinations and complete evaluations before offering any conclusions.

Of course, we all think we comply with this. Research shows, however, that once we settle on a diagnosis, our reasoning ends: we fail to gather more information or to consider evidence that has yet to be analyzed. We do not recognize important information gaps, and we neglect to generate new diagnostic questions to help fill them. Clinicians typically do not employ enough cognitive skills, averaging only about 4.4-4.7 per case out of a theorized 24 that are necessary. The GW cues us not to form any judgments until all the data has been gathered and understood.

Further, the GW reminds us that we are almost always dealing with incomplete information — and not every method of responding to incompleteness generates optimal results. For instance, when we have very little information and our uncertainty is high, we revert to "eristic" reasoning, which is typified by making assumptions about people we have not fully examined. Driven by hedonistic urges, eristic reasoning rests on wishful thinking, a desire to maintain the status quo, aversion to loss, and, again, overconfidence. We may be cued to our eristic thinking when we notice we are using non-logical approaches, such as beliefs, strong emotions, prejudice, or the protection of economic gain.

Similarly, when we attempt to make clinical decisions with moderate amounts of data and uncertainty, we employ "heuristic" reasoning. Many heuristics are hardwired into our brains to help us estimate, rather than calculate, appropriate responses when data is limited and uncertainty is moderate. Examples such as the availability heuristic, representativeness, anchoring, and the framing effect were researched by cognitive psychologists Amos Tversky and Daniel Kahneman and popularized as "fast thinking."

We must recognize when our clinical judgment rests on these shortcuts, which substitute individual experience for broad experience. This form of reasoning often leads to erroneous conclusions that could have been avoided with more complete assessments. We can spot heuristic thinking when we are seeking truth but relying on analogy and past performance, and when we are unduly influenced by the consequences of potential outcomes.

Ideally, recalling the GW will inspire us to seek as much information as we can, so that we can employ a form of reasoning called "abduction" and bring our uncertainty down to the lowest possible level. With this logic, we form a differential, then test our hypotheses through further interviewing, testing, or treatment. Crucially, we then return to our model and revise it to fit all the new data we have gained. This process is iterative, repeated until the uncertainty is so low that we can acknowledge we have reached the best solution we can. It often yields conclusions that are not only more accurate but also simpler.

With the GW in mind, we can exhibit better cognitive reasoning and reduce cognitive error in our clinical practice. Steps we can take to do so include:

1) applying humility and remembering our human limitations,

2) avoiding rapid diagnosis by always creating a differential diagnosis and following with competitive hypothesis revision,

3) completing semi-structured interviews to help us create sufficient differentials and detect important comorbidities, 

4) preserving, reviewing, and accounting for all of our data,

5) filling information gaps,

6) adopting a pluralistic approach to assessment and formulation,

7) always creating time for reflection on our work,

8) remaining aware of the method of clinical reasoning we are using, 

9) enhancing our communication skills with patients and peers, and 

10) listening to and asking for feedback on our performance, particularly negative feedback.

It is helpful to use some form of reminder to consistently pursue best practices. The Goldwater Rule can remind us of better approaches to clinical reasoning, as well as of the risks of overconfidence. Otherwise, we are on autopilot and at the mercy of brain structures that evolved to help us socialize and survive — but not to provide the complex clinical answers our patients need in contemporary medical practice.

What cognitive biases do you most often fall prey to? Share in the comments.

Dr. Putman is the author of “Rational Psychopharmacology: A Book of Clinical Skills” and “Encountering Treatment Resistance: Solutions Through Reconceptualization.” He blogs at drpaulputman.com. Dr. Putman is a 2024–2025 Doximity Op-Med Fellow.

Animation by Jennifer Bogartz

