Article Text

Implementation research in emergency medicine: a systematic scoping review
  1. Emma J Tavender1,2,
  2. Marije Bosch1,2,
  3. Michelle Fiander3,
  4. Jonathan C Knott4,5,
  5. Russell L Gruen1,2,6,
  6. Denise O'Connor7
  1. Department of Surgery, Central Clinical School, Monash University, Melbourne, Australia
  2. National Trauma Research Institute, The Alfred & Monash University, Melbourne, Australia
  3. Consultant Information Specialist, Ottawa, Canada
  4. Department of Medicine and Radiology, Melbourne Medical School, The University of Melbourne, Melbourne, Australia
  5. Department of Emergency Medicine, Royal Melbourne Hospital, Melbourne, Australia
  6. Department of Trauma, The Alfred Hospital, Melbourne, Australia
  7. Australasian Cochrane Centre, School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
  Correspondence to Emma J Tavender, Department of Surgery, Central Clinical School, Monash University, Melbourne, Australia; emma.tavender{at}monash.edu

Abstract

Introduction Implementation research aims to increase the uptake of research findings into clinical practice to improve the quality of healthcare. This systematic scoping review assesses the volume and scope of implementation research in emergency medicine (EM) to provide an overview of the field and inform future implementation research.

Methods Studies were identified by searching electronic databases and the reference lists of included studies for the years 2002, 2007 and 2012. Titles and abstracts were screened, full papers assessed for eligibility and data extracted by one author; a random sample was checked by a second author.

Results A total of 3581 citations were identified, of which 197 eligible papers were included. The number of papers increased significantly over time, from 26 in 2002 to 77 in 2007 and 94 in 2012 (p<0.05). Eighty-two (42%) focused on identifying evidence–practice gaps, 77 (39%) evaluated the effectiveness of implementation interventions and 38 (19%) explored barriers and enablers to change. Only two papers explicitly stated that theory was used. Five of the 77 effectiveness studies used a randomised design, and few provided sufficient detail about the intervention being evaluated.

Conclusions Although there was a significant increase in the number of implementation research papers, most studies focused on identifying evidence–practice gaps or used weak study designs to evaluate the effects of implementation interventions. Recommendations for improving implementation research in EM include identifying barriers and enablers to implementation, using theory in areas where important evidence–practice gaps have been demonstrated, improving the reporting of intervention content and using rigorous study designs to evaluate intervention effectiveness.

  • performance improvement
  • quality
