Don’t Trust Your Gut – Part 2
“The signature practices of science, including open debate, peer review, and double-blind methods, are designed to circumvent the sins to which scientists, being human, are vulnerable. As Richard Feynman put it, the first principle of science is ‘that you must not fool yourself–and you are the easiest person to fool.'”1
To which “sins” are we vulnerable? A better question is, do you believe you are vulnerable?
In the previous column of Follow The Science, we shifted our focus from the evidence-based medicine (EBM) pillar of research evidence to clinical experience and expertise. We discussed regression to the mean, one of the main influences that undermine the reliability of our experience and expertise. In this quarter’s column we turn to cognitive bias, and next time we will conclude the series with a discussion of industry influence.
As a reminder, the three pillars of EBM proposed by David Sackett are research evidence, clinical expertise, and patient preferences. In other words, these three pillars are the three filters we use to assess the value of evidence and guide our clinical decisions. See the previous columns, EBM as a Decision-Making Filter, What is the Best Evidence?, What EBM is Not Part 1 and Part 2 for a more thorough review.
Most of us want to be thoroughly research-based in our decisions for patients. However, if we reflected on our decision-making process, we would likely admit that a significant portion of our recommendations comes from our past experiences and the expertise of others. In other words, we are using intuition, our own and that of others, to make many of our decisions for patients.
Is there a problem with this? Is our intuition fallible? The short answer is yes. Intuition, even when derived from vast experience and expertise, is fallible. One sobering indicator is that medical error remains among the leading causes of death in the US (estimated as the third leading cause in 2016).2 Additionally, the answer is yes because we are human and therefore all susceptible to cognitive bias.
What is cognitive bias?
When a patient comes in with a red eye and photophobia that began with sudden pain upon awakening, our minds often jump, correctly, to a diagnosis of recurrent corneal erosion. If a patient calls us on the weekend or after hours with that history, we may feel comfortable prescribing based upon history alone, without seeing the patient. What we are relying on to make that judgment are “heuristics,” or “cognitive shortcuts that reduce cognitive burden by focusing on certain pieces of information rather than considering the full range of available information.” Heuristics are helpful when synthesizing a lot of information under a time constraint. However, these shortcuts in thinking can also lead us to errors in judgment. Such systematic errors are called cognitive biases, and they come in many forms.3
For a full review of cognitive biases in eye care, I highly recommend reading the excellent papers from Hussain, A., et al. in Survey of Ophthalmology4 and Shlonsky, A., et al. in Optometry & Vision Science.3 For now, I want to review the top three cognitive biases most influential in my daily clinical practice: availability bias, confirmation bias, and anchoring. You might call them the three great “sins” in clinical judgment to which I, and likely you, are most vulnerable.
Availability Bias
Availability bias happens to me most often after I read a good research article or go to a continuing education event. Whatever I learned about most recently or was most influenced by happens to be what I see in the next several days or weeks in practice. We tend to consider specific diagnoses for patients that more readily come to our minds and tend to forget diagnoses that haven’t come to mind recently. So we rely on what is most available. Years ago, when implementing new dry eye protocols to create a dry eye clinic at my practice, I tended to see everything as some form or manifestation of dry eye. Was it true? Maybe, but it likely led to overdiagnosis.
Confirmation Bias
Confirmation bias occurs when we ignore evidence contrary to what we already believe to be true. It’s easy to fall victim to it both in patients we previously diagnosed ourselves and in new patients who present with an established diagnosis from a previous doctor. I’ve seen this several times in new patients carrying a diagnosis of glaucoma. It is easy to assume the previous provider made the right diagnosis and perpetuate the care without asking, “Does this patient really have glaucoma?” Perhaps that optic nerve head thinning on OCT is physiological, non-progressive, and just “red disease.” Or perhaps there is sectoral pallor of the nerve, in an area greater than the area of thinning, suggesting a previous vascular event. Confirmation bias leads us to error when we ignore these alternative explanations despite the evidence and simply confirm the diagnosis of glaucoma.
Anchoring
Say you have a patient presenting with a unilateral red eye who is also a contact lens wearer. You inspect the cornea and find an infiltrate near the limbus with no overlying sodium fluorescein staining. Most of us would still prescribe an antibiotic for this patient because we are “anchored” by the history of contact lens wear. But if this were not a contact lens wearer, would you still prescribe an antibiotic in addition to topical steroids for marginal infiltrates? Maybe? Anchoring occurs when we place too much emphasis on one piece of information learned early in the diagnostic process. Our decision-making gets “anchored” on that feature, and we have difficulty moving past the initial finding to make the right decision.
Are you convinced? Do you think you are vulnerable to any of these systematic errors in judgment? The first step in mitigating cognitive bias is honesty and awareness. Being a Wise Practitioner (see previous post) is all about knowing what you don’t know and humbly accepting evidence contrary to personal experience and expertise.
That means that you, even you, as smart and reasonable as I know you are, are not immune from these sins or biases in judgment. This is the second reason why we can’t always trust our gut. First, it was regression to the mean. Now, it is our cognitive biases. Next time, we will turn to the uncomfortable topic of industry influence that affects our clinical decisions. Will there be any hope for our experiences and expertise?
REFERENCES
- Pinker S. Enlightenment Now. Penguin; 2018. p. 390.
- Makary MA, Daniel M. Medical Error – the Third Leading Cause of Death in the US. BMJ 2016;353:i2139.
- Shlonsky A, Featherston R, et al. Interventions to Mitigate Cognitive Biases in the Decision Making of Eye Care Professionals: A Systematic Review. Optom Vis Sci 2019;96(11).
- Hussain A, Oestreicher J. Clinical Decision-Making: Heuristics and Cognitive Biases for the Ophthalmologist. Surv Ophthalmol 2018;63(1):119-124.
Dr. Klute owns and practices at Good Life Eyecare, a multi-location practice in Eastern Nebraska and Western Iowa. He is a fellow of the American Academy of Optometry and is certified by the American Board of Certification in Medical Optometry. He writes and lectures on primary eye care practice management, evidence-based medicine, glaucoma, and dry eye disease.