Medicine in small doses
A 65‐year‐old woman with a longstanding history of lower back pain managed with non‐steroidal anti‐inflammatory drugs presented to the emergency department with crushing chest pain radiating into the left arm and back. She was triaged as having ischaemic chest pain and, despite only minor electrocardiography (ECG) changes, was taken to the cardiac catheter laboratory, where a coronary angiogram showed no abnormality. She was admitted under the care of a cardiologist and later in the day complained of epigastric pain in addition to her back pain. She was referred to a gastroenterologist, who arranged a gastroscopy for the following day. That evening, she became hypotensive with peripheral circulatory failure and the rapid response team was activated. A computed tomography (CT) scan confirmed a dissecting thoracic aortic aneurysm. The cardiothoracic surgical team was consulted and emergency surgery arranged, but she arrested in the holding bay, was asystolic and could not be resuscitated. A coroner's post‐mortem confirmed a dissection of the thoracic aorta extending into the abdominal aorta.
Cognitive biases are flaws or distortions in judgment and decision‐making that are increasingly recognized as contributors to adverse events and other patient safety issues, including wrong‐site surgery, delays in treatment and, as in this case, failure to recognize an alternative cause of chest pain, leading to a potentially preventable death.
Two processes in thinking and decision‐making help describe how cognitive biases manifest. The intuitive process is associated with unconscious, automatic, ‘fast’ thinking, whereas the analytical process is deliberate, resource‐intensive, ‘slow’ thinking. Fast thinking responds to stimuli, recognizes patterns, creates first impressions and is associated with intuitions (Kahneman D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011).
Many of life's daily activities, such as driving to work or knowing that 2 + 2 = 4, are performed using fast thinking and consume little effort or working memory; fast thinking is therefore often very useful, efficient and effective. However, it is imperfect and predisposed to predictable pitfalls in judgment – cognitive biases. These may cloud the ability to consider alternatives, affecting the analytical process where reasoning and clinical decision‐making occur. The Joint Commission reports that, in healthcare, 28% of adverse events resulting from diagnostic errors can be attributed to cognitive bias (https://www.jointcommission.org/assets/1/23/Quick_Safety_Issue_28_Oct_2016.pdf).
Saposnik et al. (BMC Medical Informatics and Decision Making 2016; 16: 138) conducted a systematic review of cognitive biases and personality traits (aversion to risk and ambiguity) associated with medical decisions, with two objectives: to identify the most common cognitive biases and to evaluate the influence of cognitive bias on diagnostic accuracy and management errors. In the 20 studies meeting the inclusion criteria, involving 16 810 physicians, 19 cognitive biases were identified, including anchoring, availability bias, blind obedience, commission bias, confirmation bias, diagnostic bias/premature closure, framing effect, omission bias, overconfidence, tolerance to risk and satisfying bias. Of these, overconfidence, lower tolerance to risk, the anchoring effect and information availability bias were associated with diagnostic inaccuracies in 35–77% of case scenarios, and 71% of studies showed an association between cognitive bias and therapeutic or management errors.
The cognitive bias most relevant to this case, anchoring bias, involves undue weight being given to, and reliance placed on, initial information or impressions. Here, significant weight was given to the initial impression that the chest pain was caused by coronary artery disease, rather than another cause such as a dissecting aneurysm or pulmonary embolism.