Predicting emergency department admissions
  1. Justin Boyle^1
  2. Melanie Jessup^2,3
  3. Julia Crilly^3
  4. David Green^3
  5. James Lind^3
  6. Marianne Wallis^2,3
  7. Peter Miller^4
  8. Gerard Fitzgerald^5
  1. CSIRO Information and Communication Technologies Centre, Level 5, UQ Health Sciences Building, Royal Brisbane and Women's Hospital, Herston, Queensland, Australia
  2. Research Centre for Clinical and Community Practice Innovation, Gold Coast Campus, Griffith University, Gold Coast, Queensland, Australia
  3. Gold Coast Hospital Emergency Department, Queensland Health, Gold Coast Hospital, Southport, Queensland, Australia
  4. Toowoomba Hospital Emergency Department, Queensland Health, Toowoomba, Queensland, Australia
  5. Faculty of Health, Queensland University of Technology, Brisbane, Queensland, Australia
  Correspondence to Dr Justin Boyle, Research Scientist, CSIRO Information and Communication Technologies Centre, Level 5, UQ Health Sciences Building, Royal Brisbane and Women's Hospital, Herston, 4029, Queensland, Australia; justin.boyle@csiro.au

Abstract

Objective To develop and validate models to predict emergency department (ED) presentations and hospital admissions by time of day and day of the year.

Methods Initial model development and validation were based on 5 years of historical data from two dissimilar hospitals, followed by validation on 27 hospitals representing 95% of ED presentations across the state. Forecast accuracy was assessed using the mean absolute percentage error (MAPE) between forecasts and observed data. The study also determined a daily sample size threshold for forecasting subgroups within the data.
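
For context, MAPE expresses the average magnitude of the forecast error as a percentage of the observed value. The snippet below is a minimal sketch of how such an error could be computed for daily admission counts; the data and function are hypothetical and are not taken from the study.

```python
import numpy as np

def mape(observed, forecast):
    """Mean absolute percentage error, expressed in per cent.

    Assumes every observed value is non-zero, which holds for the
    hypothetical daily admission counts used below.
    """
    observed = np.asarray(observed, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((observed - forecast) / observed))

# Hypothetical one-week example: observed vs forecast daily admissions.
observed_daily = [42, 38, 45, 50, 47, 55, 60]
forecast_daily = [40, 41, 44, 53, 45, 58, 57]
print(f"Daily MAPE: {mape(observed_daily, forecast_daily):.1f}%")
```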

Results Presentations to the ED and subsequent admissions to hospital beds are not random and can be predicted. Forecast accuracy worsened as the forecast interval became smaller: the best MAPE was approximately 2% for monthly admissions, 11% for daily admissions, 38% for 4-hourly admissions and 50% for hourly admissions. Presentations were more easily forecast than admissions (daily MAPE ∼7%). When accuracy was validated at the additional hospitals, forecasts for urban facilities were generally more accurate than those for regional facilities (accuracy is related to sample size). Subgroups of the data with more than 10 admissions or presentations per day had forecast errors statistically similar to those of the entire dataset. The study also produced a software implementation of the models, resulting in a data dashboard for bed managers.
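
The abstract does not name the forecasting technique used, so the sketch below substitutes a simple day-of-week historical average as a stand-in baseline, purely to illustrate how one forecast can be scored at daily and monthly intervals and why coarser aggregation tends to yield a smaller MAPE. All data, names and the model itself are hypothetical and are not drawn from the study.

```python
import numpy as np
import pandas as pd

def mape(observed, forecast):
    """Mean absolute percentage error, in per cent."""
    return 100.0 * float(np.mean(np.abs((observed - forecast) / observed)))

# Hypothetical two years of daily ED presentation counts
# (with higher weekend demand), generated at random for illustration.
rng = np.random.default_rng(0)
dates = pd.date_range("2009-01-01", "2010-12-31", freq="D")
counts = pd.Series(
    rng.poisson(lam=150 + 20 * (dates.dayofweek >= 5), size=len(dates)),
    index=dates,
)

# Fit a day-of-week average on the first year and use it to forecast
# the second year (a stand-in baseline, not the study's model).
train, test = counts.loc[:"2009-12-31"], counts.loc["2010-01-01":]
dow_mean = train.groupby(train.index.dayofweek).mean()
forecast = pd.Series(dow_mean.loc[test.index.dayofweek].to_numpy(), index=test.index)

# Score the same forecast at daily and monthly aggregation; the error
# shrinks as the interval widens, mirroring the pattern reported above.
monthly_obs = test.groupby(test.index.to_period("M")).sum()
monthly_fc = forecast.groupby(forecast.index.to_period("M")).sum()
print(f"Daily MAPE:   {mape(test, forecast):.1f}%")
print(f"Monthly MAPE: {mape(monthly_obs, monthly_fc):.1f}%")
```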

Conclusions Valid ED prediction tools can be generated from de-identified historical data; these tools may be used to assist elective surgery scheduling and bed management. The paper provides forecasting performance levels to guide similar studies.

  • Admitting department
  • hospital
  • patient admission
  • forecasting
  • emergencies
  • crowding
  • management
  • emergency department management

Footnotes

  • Competing interests None.

  • Ethics approval Initial model development and validation: the Gold Coast Health Service District Human Research Ethics Committee and the Toowoomba Health Service District Human Research Ethics Committee. Subsequent multi-site validation: the Queensland Health Central Office Human Research Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.