It’s a Trap!

Not only do we have to battle diagnostic uncertainty and a high-intensity, stressful environment, but we also have to contend with our own minds and the tricks and biases they play on us. As if things weren’t hard enough! It may come as no great surprise that a lot of the work around biases has come from Kahneman and Tversky. There has been coverage of System 1 vs. System 2 cognition in the FOAMed world [1,2] and discussions at SMACC. Different cognitive biases represent failures of both systems of cognition to varying degrees. Often biases arise as an information-processing shortcut based on past experience or knowledge. Alternatively, they may arise from our own emotions or the social environment in which we find ourselves.

If you haven’t read ‘Thinking, Fast and Slow’ by now then you really ought to. Kahneman describes a number of different failures in thinking: anchoring, availability, substitution, loss aversion, framing and the sunk-cost fallacy.

This post will cover some of these, along with a few other biases, and consider their relevance to emergency medicine. The first step to beating biases is being aware of them!

Anchoring Effect

This refers to our predisposition to be influenced by irrelevant information. Hard to believe, isn’t it?! The example used by Kahneman is to consider the following questions:

‘Was Gandhi older or younger than 114 when he died? How old was Gandhi when he died?’

Because the first question uses 114 as an anchor, people subsequently use it as a reference point and overestimate Gandhi’s age at death compared with those asked the second question alone. This happens in the ED a lot. We read the triage sheet or take the ambulance handover and then become anchored to the information we have been given. Or perhaps a patient has had a random selection of blood tests taken at triage and we notice an abnormality. This abnormality may be irrelevant to the presenting complaint but may well lead us down the path to a different, incorrect diagnosis. This ties in closely with confirmation bias, in which we seek information that backs up our own viewpoint and disregard information that contradicts it.


Availability Heuristic

This is another commonly occurring problem in the ED, and probably in medicine as a whole. Whilst the old adage ‘common things occur commonly’ is undoubtedly true, this bias refers to the idea that if something is easily recalled then it must be important or likely. We probably see this manifest in the n=1 anecdotes that we have all been told at some point. I have no evidence, but I wonder if this leads to over-investigation or mis-investigation (which is a new word I’ve just made up). If something is fresh in our minds then we are much more likely to recognise it; equally, the rarer or more unfamiliar something is, the less likely we are to recognise it.


Substitution

When confronted with a difficult or complex decision, we will often resort to a simpler heuristic, and in doing so we may never actually answer the question that was asked of us. We often weight decisions based on familiarity, and when challenged and out of our depth we may draw on that familiar experience and allow it to influence our decisions and actions. It ties in with not knowing what you don’t know. Care should be taken to recognise when we are oversimplifying a complex problem by substituting a more familiar, simpler heuristic.


Loss Aversion

Humans do not like to lose. We do not like the uncomfortable feeling that accompanies loss. In fact, we dislike the sensation of loss so much that we have developed an optimism bias: we downplay the risks in our lives and disregard them with a ‘never going to happen to me’ attitude. Drug errors, missed diagnoses and other critical incidents can occur because we assume that they will never happen to us. Thankfully, we are increasingly using checklists for particularly high-risk events, as a gentle reminder that they bloody well will happen to you. Pay attention to incident reports and serious incidents, make a note of the errors that have occurred, and be mindful of them in your own practice. Mistakes and bad things will happen to us all, but we can attempt to mitigate them by recognising our inherent optimism (whether we feel optimistic or not!). Equally, our patients suffer from the same optimism bias, and it will influence their decisions to engage with treatment or lifestyle advice. Smokers downplay the risks associated with the habit because, unconsciously, they may believe the detrimental effects will never happen to them.


Fundamental Attribution Error

This is perhaps the simplest and, in my opinion, the most important. The fundamental attribution error describes how we judge or interpret the behaviour or actions of others, or, more accurately, how we misjudge and misinterpret them. The saying ‘don’t judge a man (or woman) until you’ve walked a mile in their shoes’ encapsulates it in a nutshell. Against the background of our own morals, experiences, stresses and mood, we infer the behaviour and intent of others. If someone knocks into you in the street and walks away without apologising, how you interpret that person’s intent will depend on those variables. The point is that you do not know what the other person is thinking or experiencing, and you should therefore be careful when judging their intent. Not everyone who knocks into you and walks away without apologising is a d**k, so be careful not to judge them as such. Of course, some people are just d**ks and they, sadly, are unavoidable.

Acknowledging cognitive biases is one thing; doing something about them is an entirely different issue. The existence of biases can be taught, but recognising them in oneself has to be practised. I doubt anyone in the world has complete awareness and mastery of the biases that influence their decisions and behaviour, but we should all be on the path to recognition. In the words of our own Iain Beardsell:

Cheers

Rich

@richcarden


1. Laing S. SMACC Day 1. HEFTEMCAST. http://www.heftemcast.co.uk/smacc-day-1/. Published June 14, 2015.
2. Weingart S. OODA Loops. EMCrit. http://emcrit.org/podcasts/ooda-loops/. Published April 2, 2016.

Cite this article as: Richard Carden, "It’s a Trap!," in St.Emlyn's, September 25, 2016, https://www.stemlynsblog.org/its-a-trap/.

3 thoughts on “It’s a Trap!”

  1. A whole lecture circuit has been created around this topic.

    The taxonomy of cognitive biases is fundamentally addressed by asking oneself the question ‘What have I forgotten or what did I miss?’. There is no solution with deficient knowledge – ‘unknown unknowns’.

    But currently there is poor evidence that teaching metacognitive strategies (thinking about one’s own erroneous thinking) reduces error.

    Assoc. Prof. Jonathan Sherbino, who spoke at SMACC, has published a few interesting articles and studies challenging the view that this is simply a System 1 versus System 2 problem:

    http://www.ncbi.nlm.nih.gov/pubmed/26825476 – “Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups.”

    http://www.ncbi.nlm.nih.gov/pubmed/21240788 – “The effectiveness of cognitive forcing strategies to decrease diagnostic error: an exploratory study.”

    http://www.ncbi.nlm.nih.gov/pubmed/24362377 – “The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning.”

    My argument is that experts operate in System 1 most of the time. The problem occurs when we forget to toggle to System 2 in the face of red flags. This problem is worse for novices, who don’t recognise them.

    http://emedsa.org.au/CoreMed/2016/07/24/red-flags/

  2. Pingback: Global Intensive Care | It’s a Trap!

  3. Great piece Rich! There is some discussion to be had as to how much these heuristics represent ‘failings’ of cognition. Much of the work Kahneman produced used students in artificially constructed experimental settings designed, some have argued, to deliberately invoke ‘flawed’ decision making.

    Studies of pilots, firefighters, mechanics and other professionals making intuitive Type 1 decisions in real-life settings show much more favourable results, particularly as expertise develops. I think we’d struggle to make any decisions without these inbuilt ‘failings’.

    Gary Klein’s book ‘Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making’ explores this in detail and is a great read, and Scott’s interview with the author is equally illuminating:

    http://emcrit.org/podcasts/decision-making-gary-klein/

    Fantastic post on a really interesting area.

