Casey and the case of the missed diagnosis

Meet Casey. She arrived at your emergency department one Saturday afternoon a few weeks ago. She’s a cheerful, previously healthy 19-year-old, partway through her second year at your local Higher Education institution.

She’s also just spent 4 days on Critical Care, and is the subject of a Serious Untoward Incident/Level 3 investigation about the care she received in the ED. Uh-oh, so what happened? Well, briefly, she came in with abdominal pain, was seen by one of the FY2s (junior doctors) and referred to the surgeons with ?appendicitis. She was fairly well, so she went to the surgical assessment area. The surgical team were busy, as they often are on Saturdays, and she wasn’t seen for several hours. When they got to her, she was seriously unwell, and after some IV antibiotics and narrowly avoiding a CT abdo for ?perforation, she was admitted to ICU, where she recovered slowly from her DKA.

Pardon? Her what? How did that happen? How did the doctors get the diagnosis that wrong? And what has it got to do with the nursing staff?

That’s where it gets interesting; and time to introduce the concept of cognitive biases. You see, you could be forgiven for thinking that thinking itself is a fairly straightforward process. Information goes in, your brain weighs it and manipulates it a bit like a computer, then thoughts, opinions, speech and actions come out. Sadly not. Mental processing is affected by all manner of things: booze and drugs obviously, but also tiredness, hunger, emotions. And even if we’re sober, well slept, well fed, happy and adequately caffeinated, we are subject to a whole variety of bearpits. So let’s use Casey to illustrate just a few of them:

Anchoring

When Casey came into triage complaining of abdominal pain, she herself said “I think I might have appendicitis”. This influenced all the subsequent decisions; rather than asking the general questions that he might have done if she’d said “I have tummy pain”, the triage nurse concentrated on appendix-related questions, like “Have you been vomiting?” (yes). The doctor continued this in her assessment, finding that Casey’s abdomen was tender, but not really noticing the depth of her respirations.

Availability

This relates to the brain’s ability to access more recent information more easily. Ever noticed that when you have had a teaching session on a topic, or there’s been a news piece about a particular disease, the department suddenly seems to be full of patients with the same thing? It isn’t; it’s your brain labelling them more easily. In Casey’s case, the doctor had rotated in December from an FY2 job on the surgical assessment unit, where she had seen multiple patients with (guess what?) appendicitis. This had been reinforced by a recent teaching session on “Abdominal Catastrophes not to miss”. So her brain was primed to label Casey’s symptoms as relating to her abdomen; she excluded the other big catastrophe (the ectopic pregnancy), but her differential diagnosis was limited.

Confirmation bias

This is the tendency to interpret new data in the light of conclusions you have already reached (even if you don’t realise you’ve reached them yet). So when Casey was tachycardic and tachypnoeic (fast heart rate and breathing) but didn’t have a temperature or raised white cells, the doctor rationalised these: she said to herself that Casey’s physiology just hadn’t had a chance to develop pyrexia or raised white cells, rather than asking whether there was an alternative diagnosis that might fit better.

Dunning-Kruger

Not knowing what you don’t know. Also known as the confidence-competence mismatch. Or, to quote the late John Hinds “Being a d*** in resus”. This tends to come in waves as people (docs, nurses and AHPs) progress through training and seniority. In this case the FY2 was beginning to find her feet. She felt like a valuable part of the team, had just passed her first postgraduate exam, and was about to be interviewed for a training job in surgery. She had moved from “conscious competence” at one level to “unconscious incompetence” in the next (more of this in a later blog).

Satisficing and premature closure

Also known as “why you miss the second fracture”. There is a human tendency to find one solution and stop thinking. Casey’s doc had found a diagnosis that fitted (mostly). Carrying on after this with the mental effort of making sure there isn’t a better fit is hard work, particularly in our time-pressured environment, where it may be much harder to say “actually it isn’t appendicitis after all and no, they can’t go to surgical ambulatory care, they need to be moved into resus”.

Next time: how to build your ED to mitigate cognitive bias.

More information

Quick reference guide to cognitive biases in medicine from Skeptical Medicine

Natalie May and St Emlyns do a deeper dive

A note of caution about cognitive biases from Life in the Fast Lane

The invisible gorilla – a whole book of fascinating stuff on how our brains deceive us

RCN Emergency Nursing Framework: CCT 1.1.3, CCT 1.1.4, CD1 6.1.3
