Emergency medicine is a varied and exciting specialty in which the aim is not always to confirm a diagnosis but to be safe and appropriate in your management of potential diagnoses. Medical and nursing staff are therefore taught risk stratification of the presenting signs and symptoms they see. For example, generally it is far more important that the severity of respiratory distress is recognised, rather than its underlying cause. A correct diagnosis of bronchiolitis is irrelevant if you have missed the fact that the child is peri-arrest due to hypoxia and respiratory fatigue.
In their qualitative study of junior doctors’ decision-making, Bowen et al1 explore how experience influences decision-making in the ED. They focus on breathing difficulty in children under 5 years, but their work is relevant to many conditions. Unsurprisingly, junior doctors and nurses rely on guidelines and admission criteria and are generally risk averse. Feeling safe is important to them,2 but as clinicians become more experienced, ‘intuition’, however that might be defined, plays a more prominent role. It is surprising how little we concentrate on the process of making a diagnosis during training. Different diagnostic approaches have been identified3 among junior doctors, and we are increasingly realising how gut feeling affects decision-making.4,5 How guidelines interact with experience and gut feeling has yet to be completely delineated, but clearly in some cases guidelines trump gut feeling, and vice versa. The ROLMA matrices6 are 2×2 tables that aid visualisation and understanding of how evidence-based decision-making relates to clinical outcomes. They were devised during the creation of a novel framework to evaluate educational interventions.7 Table 1 shows the ROLMA 2 matrix, which divides clinical management decisions into evidence-based and non-evidence-based and compares these with clinical outcomes. The matrices describe what may be easy to capture qualitatively but difficult to audit quantitatively: that a senior clinician can effectively ignore national or local guidance and the patient still have a positive outcome.
At what stage do you have sufficient experience to make independent decisions that are potentially inconsistent with suggested practice?
A 2-year-old presents with fever, tachycardia (195 bpm) and a prolonged capillary refill time. The mother describes a lethargic child and says she has never seen him this ill. On paper this child meets many criteria for investigation and potential treatment for sepsis. An experienced consultant may feel the child is unwell secondary to distress from a viral illness and choose to observe closely. An hour later the child is running around the department with a snotty nose.
It can be difficult to teach how to discriminate between features of illness that lie on a spectrum and those that are absolute (apnoea is absolute; the extent of retractions/recession is a matter of perspective). It is harder still to bring multiple subjective clinical features together to confirm or refute a diagnosis. There is a risk of trainees becoming falsely confident in their skills after training if they lack the experience to place this learning in context. Conversely, we could produce a risk-averse generation of doctors if they are not pushed to explore how guidelines provide a framework, not a manual, of care.
In their conclusion Bowen et al challenge us to think about how to develop learning programmes to help us deal with the ever-growing demand for paediatric acute and emergency care. But what type of programme do we need to implement, and do we really know how junior doctors think? This question has started to be answered through the lens of ‘Dual Cognition Theory’: the separation of automatic, instinctive cognition (type 1 thinking) from more reflective, analytical thinking (type 2), popularised by Daniel Kahneman in his book Thinking, Fast and Slow.9 Adams et al3 interviewed 37 doctors using this framework and found that both types were used, across three stages: Case Framing, Evolving Reasoning and Ongoing Uncertainty. In Case Framing, junior doctors primarily use type 1 thinking to decide whether a patient is well enough to undergo a history and examination, or whether features are present that require urgent intervention. Adams et al highlight that premature closure (convincing yourself you have adequate information to make a diagnosis) can be a problem for junior doctors at this stage, and suggest encouraging follow-up of cases to counter the way certain cues, taken from an ambulance handover for example, may lead to an incorrect diagnosis. Evolving Reasoning uses a combination of type 1 and type 2 thinking. This stage, involving the application of pattern recognition, is under-utilised in medical education. Also, although reflection is a core component of personal development, the skills needed to reflect adequately are not standardised, and as a result opportunities to improve pattern recognition are lost. Managing the final stage, Ongoing Uncertainty, is vital in emergency care, but it is likely we underappreciate how challenging junior staff may find it. If educators are not sensitive to this, it is possible they are encouraging cognitive biases rather than preventing them. In their paper the authors quote a junior doctor:
FG2: ‘If I ever have a patient I think has a chest infection CURB65 is just amazing because then you can say, their score is this, therefore, they do not need to be in hospital.’
To avoid uncertainty, the doctor may become over-reliant on scoring systems and guidelines, which will not always serve the clinical question being asked. This is a form of information bias (the tendency to seek information even when it is not relevant to the outcome). Rushing to make a diagnosis, or at least to have a management plan, increases the risk of anchoring bias. However, understanding how to use evidence, and that observation must be a proactive rather than passive process, means that managing uncertainty needs specific training.
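The limits of a score such as CURB-65 can be made concrete. A minimal sketch, assuming the published CURB-65 criteria (the function name and variable names are illustrative, not from the article or the quoted study): the score encodes only five variables, so a low score cannot rule out everything a clinician might worry about.

```python
# Hypothetical helper sketching the CURB-65 score quoted above.
# One point each for: Confusion, Urea > 7 mmol/L, Respiratory rate >= 30/min,
# low Blood pressure (systolic < 90 or diastolic <= 60 mmHg), and age >= 65.

def curb65(confusion: bool, urea_mmol_l: float, resp_rate: int,
           systolic_bp: int, diastolic_bp: int, age: int) -> int:
    """Return the CURB-65 score (0-5)."""
    return sum([
        confusion,
        urea_mmol_l > 7.0,
        resp_rate >= 30,
        systolic_bp < 90 or diastolic_bp <= 60,
        age >= 65,
    ])

# A patient can score 0 yet still be seriously unwell for reasons the five
# inputs do not capture -- the information bias described above.
print(curb65(confusion=False, urea_mmol_l=5.2, resp_rate=22,
             systolic_bp=118, diastolic_bp=76, age=40))  # 0
```

The point is not that the score is wrong, but that it answers a narrower question than the one the junior doctor is using it for.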
How can educators be more mindful of how learners are taught to understand the way they process information, and could we reduce unnecessary delay in a learner’s transition to an expert’s level of cognition? There are a number of potential strategies:
I. Provide adequate and appropriate clinical exposure. This is not just time spent seeing patients; it is time spent being exposed to the challenge of decision-making. Shifts spent only in the Resuscitation area will not help you understand the decision-making relevant to sending home a 90-year-old gentleman following a fall. Seeing hundreds of teenagers with injuries will not improve your ability to escalate, or de-escalate, treatment in a child with moderate wheeze and respiratory distress. As trainees move through the system they will approach the same clinical dilemmas with a different mindset and philosophy as they find a balance between type 1 and type 2 thinking; it may therefore be necessary to repeat attachments in certain clinical areas (eg, paediatrics) if trainees have not been exposed to them for some time.
II. Do not assume that teaching about cognitive biases will enable learners to avoid them. Sherbino et al demonstrated that diagnostic errors did not fall after a focused intervention in which experts in clinical reasoning taught cognitive strategies.10
III. Trainers must consider discussing why decisions were made, not just what decision was made.
To support decision-making, clinical exposure is needed with close and effective oversight, especially in the early years of training. As pressures on services perpetually increase, there may come a time when systems fail both because they are overwhelmed and because staff do not have the breadth of skills to deal with the complexity of demand. Teaching and understanding decision-making needs to become an essential component of all training programmes.
Competing interests None declared.
Provenance and peer review Commissioned; externally peer reviewed.