Article Text

What is the inter-rater agreement of injury classification using the WHO minimum data set for emergency medical teams?
  1. Anisa Jabeen Nasir Jafar1,
  2. Jamie C Sergeant2,3,
  3. Fiona Lecky4,5
  1. 1 HCRI, University of Manchester, Manchester, UK
  2. 2 Centre for Biostatistics, School of Health Sciences, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
  3. 3 Centre for Musculoskeletal Research, Arthritis Research UK Centre for Epidemiology, University of Manchester, Manchester, UK
  4. 4 Health Services Research, University of Sheffield, Sheffield, UK
  5. 5 Emergency Department/TARN, Salford Royal Hospitals NHS Foundation Trust, Salford, UK
  1. Correspondence to Dr Anisa Jabeen Nasir Jafar, HCRI, University of Manchester, Manchester M15 6JA, UK; anisa.jafar{at}manchester.ac.uk

Abstract

Background In 2017, the WHO produced, by expert consensus, its first minimum data set (MDS) for emergency medical team (EMT) daily reporting during sudden-onset disasters (SODs). The MDS was deliberately designed to be simple in order to improve the rate of data capture; however, it is new and untested. This study assesses the inter-rater agreement between practitioners when performing the injury aspect of coding within the WHO EMT MDS.

Methods 25 clinical case vignettes were developed, reflecting potential injuries encountered in an SOD. These were presented online from April to July 2018 to practitioners with experience of, or training in, managing patients in SODs. The practitioners were drawn from UK-Med's members, Australian Medical Assistance Team's Northern Territory members and New Zealand Medical Assistance Team members. Practitioners were asked to code injuries according to WHO EMT MDS case classifications. Randolph's kappa statistic for free-marginal multirater data was calculated for the whole dataset as well as for subgroups to ascertain inter-rater agreement.
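For readers unfamiliar with the statistic named above, the following is a minimal sketch of Randolph's free-marginal multirater kappa. It is not the study's analysis code; the function name and the example counts are illustrative assumptions. Each row of `counts` records, for one vignette, how many raters chose each MDS injury category; chance agreement under free marginals is 1/k for k categories.

```python
# Sketch (illustrative, not the study's code): Randolph's free-marginal
# multirater kappa for a set of cases rated into k categories.
def free_marginal_kappa(counts):
    """counts: one row per case; each row holds the number of raters
    choosing each category, with every row summing to the same n."""
    n = sum(counts[0])   # raters per case
    k = len(counts[0])   # number of categories
    N = len(counts)      # number of cases
    # Observed agreement: proportion of agreeing rater pairs per case,
    # averaged across cases.
    p_o = sum(
        sum(c * (c - 1) for c in row) / (n * (n - 1))
        for row in counts
    ) / N
    p_e = 1.0 / k        # chance agreement with free marginals
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: 2 cases, 10 raters, 3 categories.
kappa = free_marginal_kappa([[8, 1, 1], [5, 5, 0]])
```

Unlike Fleiss' kappa, the free-marginal form does not estimate chance agreement from the observed category frequencies, which suits a design where raters are not constrained to use categories at fixed rates.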

Results 86 practitioners responded (20.6% response rate), giving >2000 individual case responses. Overall agreement was moderate at 67.9%, with a kappa of 0.59 (95% CI 0.49 to 0.69). Although the subgroups of paramedics (kappa 0.63, 95% CI 0.53 to 0.72), doctors (kappa 0.61, 95% CI 0.52 to 0.69) and those with disaster experience (kappa 0.62, 95% CI 0.52 to 0.71) showed slightly higher point estimates, their CIs (and those of the other subgroups) indicate broadly similar, moderate levels of practitioner agreement in classifying injuries according to the MDS categories.

Conclusions An inter-rater agreement of 0.59 is moderate at best; however, it gives ministries of health some sense of how tightly they may interpret injury data derived from daily reports using the WHO EMT MDS. Furthermore, this kappa is similar to those of established but more complex (and thus more contextually impractical) injury scores. Similar studies, weighted for injury likelihood using sample data from SODs, would further refine the level of expected inter-rater agreement.

  • disaster planning and response
  • data management
  • global health


Footnotes

  • Twitter @EMergeMedGlobal

  • Contributors AJNJ developed the concept of the study, designed the methodology, conducted the study, analysed the data and wrote the manuscript. JCS helped develop the methodology of the study, developed the data analysis plan and contributed to the manuscript. FL helped develop the methodology of the study and contributed to the manuscript.

  • Funding This work was supported by the Royal College of Emergency Medicine & Hong Kong Jockey Club Charities Trust who supported the first author’s PhD which generated this study.

  • Competing interests AJNJ has previously provided consultancy to the UK EMT & contributed to the WHO EMT MDS working group.

  • Patient consent for publication Not required.

  • Ethics approval The University of Manchester ethical review manager granted approval for this low risk study on 22 January 2018 with reference 2018-3577-4808.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data availability statement Data are available on reasonable request. Please contact the corresponding author for any further details on the study.
