Developing indicators for emergency medical services (EMS) system evaluation and quality improvement: a statewide demonstration and planning project

Jt Comm J Qual Improv. 2001 Mar;27(3):138-54. doi: 10.1016/s1070-3241(01)27013-8.

Abstract

Background: The state of California, like every other state, has no system for assessing the quality of prehospital emergency medical services (EMS) care. As part of a statewide project, a process was designed for the evaluation and quality improvement (QI) of EMS in California. Local EMS agency (LEMSA) representatives committed to submitting data from both the providers and the hospitals they work with.

Indicator selection and development: For conditions such as cardiac chest complaints, standardized indicators had already been developed, but for many other areas of interest the literature was sparse or lacked consensus. Definitional differences were often tied to differences in local practice protocols, and those protocol differences posed a related challenge for comparison across systems. Some aspects of care may not be offered at all, which may reflect resource shortages or variable medical direction.

Data collection procedures: Each indicator was precisely defined, and definition sheets and data troubleshooting report forms were provided to participants in three data-collection rounds. Participants were given 1 month to collect the data, which consisted of summary-level elements (for example, average time to defibrillation for all patients 15 years or older who received defibrillation in 1998). Data were then aggregated, analyzed, and prepared for display in graphs and tables.

Access and measurement issues: Numerous data collection problems were encountered. For example, not all participants could access data that they had thought would be available. Linking patient data across the continuum of EMS care (dispatch, field, hospital) and linking EMS data to hospital outcomes also proved difficult. Yet even when data were readily available, challenges arose. The need for specificity, the potential misfit between definitions and the available data, and the difficulty of data retrieval remained salient for the duration of the project and made cross-LEMSA and cross-provider comparison problematic.

Recommendations and lessons learned: The project led to formal policy recommendations regarding development of a state-defined minimum data set of structure, process, and outcome indicators and their associated data elements; provision in the minimum data set for both local-level and statewide indicators; and provision of technical assistance at the local-provider level.

Epilogue: Since the project's conclusion in June 2000, many regional and local EMS groups have begun to collect data on indicators. Many of the project's recommendations have been incorporated into the work plan of the state's System Review and Data Committee.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Ambulances / standards*
  • California
  • Child
  • Data Collection / methods
  • Emergency Medical Services / standards*
  • Humans
  • Management Information Systems
  • Medical Record Linkage
  • Outcome and Process Assessment, Health Care
  • Pilot Projects
  • Quality Indicators, Health Care*
  • Total Quality Management / organization & administration*