
The development and evaluation of an evidence-based guideline programme to improve care in a paediatric emergency department
Ayobami T Akenroye1,2, Anne M Stack1,2

1 Division of Emergency Medicine, Boston Children's Hospital, Boston, Massachusetts, USA
2 Department of Pediatrics, Harvard Medical School, Boston, Massachusetts, USA

Correspondence to Anne M Stack, Boston Children's Hospital, 300 Longwood Avenue, Boston, MA 02115, USA; anne.stack{at}


Introduction Care guidelines can improve the quality of care by making current evidence available in a concise format. Emergency departments (EDs) are an ideal site for guidelines given the wide variety of presenting conditions and treating providers, and the need for timely decision making. We designed a programme for guideline development and implementation and evaluated its impact in an ED.

Methods The setting was an urban paediatric ED with an annual volume of 60 000. Common and/or high-risk conditions were identified for guideline development. Following implementation of the guidelines, their impact on effectiveness of care, patient outcomes, efficiency and equitability of care was assessed using a web-based provider survey and performance on identified metrics. Variation in clinical care between providers was assessed using funnel plots.

Results Eleven guidelines were developed and implemented. Three years after the initiation of the programme, self-reported adherence to recommendations was high (95% for physicians and 89% for nurses), and 97% of physicians and 92% of nurses stated that the programme improved the quality of care in the ED. For some guidelines, provider-to-provider variation in care practice was reduced significantly, and for one guideline we found reduced disparity in imaging. There were also reductions in the utilisation of diagnostic tests and therapies. As a balancing measure, the percentage of patients with any of the guideline conditions who returned to the ED within 72 h of discharge did not change from before to after guideline initiation. Overall, 80% of physician and 56% of nurse respondents rated the guideline programme at the highest value.

Conclusions A programme for guideline development and implementation helped to improve efficiency, standardise care and eliminate disparities in emergency care without jeopardising patient outcomes.

  • emergency care systems
  • emergency care systems, efficiency
  • emergency department
  • paediatric emergency med
  • paramedics, guidelines


Key messages

What is already known on this subject?

  • Clinical guidelines can be used to standardise care and reduce unnecessary resource utilisation. Despite the presence of national or local guidelines, adoption by physicians can be challenging.

What might this study add?

  • We developed and implemented a set of evidence-based guidelines that were readily adopted by a large group of clinicians. Use of guidelines led to decreased variation in care and resource utilisation. Clinicians rated the programme highly.


Evidence-based guidelines (EBGs) have been increasingly applied in the healthcare setting with the goals of reducing variation and improving care.1 By standardising care processes, patient outcomes may be improved and care can be studied and modified based on outcomes.2 The development of a guideline, though time-consuming and resource-intensive,3,4 is, however, only a single step in the process of improving patient care. EBGs are useful only if there is effective translation to the bedside.2,4–6 It is therefore essential that they are fully adopted and used in the local setting, yet implementation poses a challenge,7–12 and implementation in a complex environment of care such as an emergency department (ED) can be more challenging still. There is often high acuity, high patient volume, concurrent demand for timely care of acutely ill patients and variability in providers. Like many other academic settings, ours has a large number of physicians and nurses with a range of experience and training. We recognised the need to reduce variation in care and improve efficiency. The objective of this paper is to describe how we developed and/or adopted EBGs specific to paediatric emergency care, the implementation strategy used and the methods applied to evaluate the success of the programme. We also provide information on achievements in standardising care and decreasing resource utilisation, and share the lessons learnt from establishing this programme that may be helpful to clinicians or administrators interested in developing a local improvement portfolio.



Methods

The setting was the ED of Boston Children's Hospital, a tertiary, university-affiliated paediatric hospital, with an annual ED volume of 60 000 patients and an admission rate of ∼18%. Providers include 45 paediatric emergency medicine specialists, 25 general paediatricians, 72 nurses, 16 emergency medicine fellows and about 200 rotating residents. Guidelines for many common paediatric emergency conditions did not exist, or, where national guidelines existed, had not been formally adopted locally.

Human subjects protection

According to the policy for activities that constitute research at Boston Children's Hospital, this work met criteria for operational improvement activities exempt from ethics review.

Planning the intervention

EBG team

Leadership support was critical to the development of an EBG programme in the ED. The EBG team was made up of division leaders, including the division chief and clinical chief; individual EBG champions, each consisting of a paediatric emergency medicine physician and a nurse partner; a quality improvement (QI) specialist; a data analyst; an ED electronic medical record (EMR) liaison; an ED pharmacist; and administrative support. Support and input from providers in subspecialty departments were sought as appropriate.

Building an organisational culture of evidence-based practice

The EBG programme was built on the evolving normative culture of evidence-based practice in the department accomplished by:

  • Clear communication to providers on the expectation of providing evidence-based care. This was done by: (A) discussion of current and emerging evidence on case management at the physicians’ weekly 4-h conference; (B) development of a local evidence-based medicine (EBM) website that houses key publications in emergency medicine updated at least yearly by faculty for their respective EBM topics; and (C) monthly ‘Journal Watch’ where pertinent articles recently published in peer-reviewed journals are discussed.

  • Nurturing of openness and transparency including monthly morbidity and mortality conference where providers critically evaluate care in a collegial environment.

  • Creation of robust support mechanisms to improve evidence-based practice such as through the use of computerised order entry and employment of quality personnel.

  • Creation of a nimble, didactic conference environment where ideas are shared and support is garnered.

Developing the EBG

From preliminary data, ED leadership had identified variation in care practices and wide differences in resource utilisation for certain conditions. Following the decision to initiate a local EBG programme, physicians were encouraged to come forward with candidate conditions for guidelines. Conditions were prioritised based on: number of patients affected; potential for high impact such as a disease with high morbidity and/or mortality; potential for elimination of variation in practice patterns and potential for improved efficiency through the reduction of unnecessary resource utilisation.

The EBG development process, as designed by the division leaders, was as follows: the chief and clinical chief of the division met with the EBG physician champion to discuss the general principles of guideline development using the AGREE instrument as a guide,13 and provided an orientation to the development process and specific considerations for the guideline, for example the eligible patient population. The scope of the guideline was established using the PICO framework (patient/problem, intervention, comparison, outcomes).1 National guidelines, if available, were studied by the physician and updated with current evidence. They were also adapted to fit the local context, with emphasis on areas where we had identified significant evidence-to-practice gaps in our ED. For instance, we added contact precautions to our bronchiolitis guideline for the management of patients with suspected or confirmed respiratory syncytial virus (RSV) infection. The major, up-to-date recommendations of each guideline were retained unchanged. We did, however, empower the attending physician to deviate from the guideline for individual cases as appropriate. To this effect, we adapted the American Academy of Pediatrics guidelines on classifying EBG recommendations,14 and used language indicating the strength of the evidence. For instance, the word ‘should’ was used where strong evidence supported the recommendation, whereas where the evidence was equivocal the words ‘may’ or ‘consider’ were used to allow flexibility, with the aim of learning from the clinical expertise and rationale of providers for their management choices.

Next, at least five physician and five nurse colleagues and/or subspecialists were engaged for their expert opinion on the scientific soundness and usability of each guideline. Once a draft algorithm was created, the recommendations and supporting evidence were discussed at the departmental conference, which helped to gather local support. On the internal website, providers could also access the EBM folder housing the supporting literature. Prior to implementation, a pharmacist review for medication choice and accuracy was completed. The ED EMR physician liaison was crucial to the success of supporting computerised order entry. Some EBG champions preferred an ‘all-inclusive’ order set with many options, while others preferred a more streamlined approach in which physicians were guided to choose the recommended care path. We allowed latitude here, and hope in future iterations to study the impact of order set stringency on recommendation adherence. Completed guidelines included an entire package of deliverables: the care algorithm, an electronic order set, comprehensive discharge instructions and quality measures.

Implementation strategies

Initially, once the guideline package was ready, a semistructured process was followed for implementation. However, it soon became clear that provider awareness was lacking. To address this, a novel EBG Implementation Team was formed to ensure successful translation to practice. The team consisted of two physicians, two nurses, a QI expert, a data analyst and an administrator, with the support of the clinical chief and individual EBG champions. To guide the choice of implementation strategies, the group reviewed the existing implementation science literature extensively and employed the strategies most likely to be effective in our local setting. Using the Pathman (awareness-agreement-adoption-adherence) model, we developed a structured process for implementation15 (figure 1). The implementation process proceeded in multiple iterations until compliance data showed that recommendations had been embedded into practice, as evidenced by stable improvement sustained over many months to years.

Figure 1

Process for implementation.

Strategies used to improve awareness of and adherence to EBG recommendations included presentations at physician and/or nursing meetings, use of posters in high traffic locations, development of binders with laminated copies of each guideline, one-on-one discussions with providers, web-based videos, pocket cards with algorithms and regular reports of performance to champions and to the entire division.

Programme evaluation

Provider survey

To assess the impact of the guideline programme 2 years after its initiation, we developed a simple 14-question anonymous web-based survey (see online supplementary appendix A) covering the overall impact of the EBG programme, adoption of recommendations and the effectiveness of the implementation strategies. The survey was created by the improvement team and piloted with five providers for flow and ease of use. It was distributed 3 months after roll-out of the first eight EBGs (about a year after the first EBG was rolled out) to allow a wash-in period of guideline use. Data were analysed using descriptive statistics for the 10 questions with dichotomous, 4-point or 5-point Likert scale responses. Comments from the final four questions were collated and categorised into themes; feedback was given to the QI team during a presentation and to providers via email. Detailed survey results were also sent to the EBG champions.

Development and use of quality measures

To assess the clinical impact of the guidelines, three to eight quality measures, including balancing measures, were developed for each (see online supplementary appendix B). Candidate measures were suggested by the EBG champions and vetted for importance and feasibility by the EBG team. Efforts were made to balance quality measures across the framework of structure, process and outcome16 as well as the Institute of Medicine domains.17 Because of the episodic nature of ED care, we chose returns to the ED resulting in hospital admission within 72 h of discharge as the primary outcome measure for each guideline.18 Detailed measurement plans were developed and served as comprehensive guides for data extraction (online supplementary appendix C shows a sample plan). Every week, the clinical chief, QI expert and data analyst met to discuss the best method of extracting the data and to define the measures, involving the respective EBG champions as appropriate. The process of achieving accurate data took months for some measures, while others were relatively straightforward.

We monitored performance monthly using statistical process control charts, which display and analyse time series data.19 These charts allowed us to study process variation and to identify when a process changed following an intervention.20–22 Unlike traditional statistical methods, control charts, which are widely used to demonstrate healthcare improvement efforts,23–28 can reflect the impact of an intervention or occurrence on a process within a relatively short time while reducing the chance of spuriously attributing a change to an intervention. Control chart rules help differentiate variation due to a ‘special cause’, such as an improvement project, from underlying random variation (‘common cause’).19,21
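The control chart logic described above can be sketched briefly. The following is a minimal illustration (not the authors' actual tooling), assuming a p-chart of monthly proportions, such as the fraction of bronchiolitis patients receiving a chest radiograph, with 3-sigma limits derived from a baseline period and the simplest special-cause rule (a point outside the limits):

```python
import math

def p_chart(counts, totals, baseline_months):
    """p-chart sketch: centre line from a baseline period, then per-month
    3-sigma limits and a flag for the simplest special-cause rule
    (a point falling outside the limits)."""
    # Centre line: pooled proportion over the baseline months.
    p_bar = sum(counts[:baseline_months]) / sum(totals[:baseline_months])
    limits, special = [], []
    for x, n in zip(counts, totals):
        # Binomial standard error; limits widen when monthly volume n is small.
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)
        ucl = min(1.0, p_bar + 3 * sigma)
        limits.append((lcl, ucl))
        special.append(x / n < lcl or x / n > ucl)
    return p_bar, limits, special
```

Fuller rule sets (eg, eight consecutive points on one side of the centre line, signalling a shift) and annotated charts are provided by dedicated SPC software; the sketch above shows only why a sustained post-intervention drop in utilisation registers as special-cause variation rather than noise.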

To assess the impact of a guideline on disparities in care, another measure of standardisation, we evaluated the proportion of patients with minor head injury from the different payer and racial/ethnic groups undergoing brain imaging before and after the Minor Head Trauma EBG was implemented.

Whenever we observed non-random or ‘special cause’ variation indicative of poorer performance, we gave booster doses of improvement strategies to reinvigorate the process, such as one-on-one discussions, emailing specific clinicians, addressing misconceptions at our weekly teaching meetings, awareness campaigns and fresh poster reminders.

Individual physician performance

Another crucial part of EBG programme analysis was the addition of individual performance measures to the standing annual individual performance report shared confidentially with each attending by the division chief. We used funnel plots to display provider-to-provider variation for selected measures. Funnel plots are a particularly illustrative way to show whether true variation is present when numbers are low, since a single patient can significantly affect the event rate of a provider with a low patient volume.19 Using a variance comparison test, we tested for a difference in variation before and after guideline implementation. In these reports, individual physicians were able to visualise their performance in comparison with deidentified peers.
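As a rough illustration of the funnel plot idea (a hypothetical sketch, not the authors' code), each provider's event rate can be compared against 3-sigma limits that narrow as that provider's patient volume grows, so a low-volume provider is not flagged on the strength of one or two patients; the variance comparison can be summarised by the ratio of provider-to-provider rate variances before and after implementation:

```python
import math
from statistics import variance

def funnel_flags(events, patients):
    """Flag providers whose rate falls outside 3-sigma funnel limits."""
    p_bar = sum(events) / sum(patients)  # group average rate
    flags = []
    for x, n in zip(events, patients):
        # Limits tighten as a provider's patient volume n increases,
        # so only genuinely discrepant high-volume providers are flagged.
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        flags.append(abs(x / n - p_bar) > 3 * sigma)
    return p_bar, flags

def variance_ratio(rates_before, rates_after):
    """F statistic comparing provider-to-provider spread before vs after;
    a ratio well above 1 suggests reduced variation after implementation."""
    return variance(rates_before) / variance(rates_after)
```

A formal test would refer this ratio to an F distribution with the appropriate degrees of freedom; weighting each provider's rate by patient volume is a further refinement omitted here for brevity.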


Results

Based on the EBG selection criteria described above, 11 EBGs were implemented over a 1-year period (box 1).

Box 1

List of guidelines developed

  • Abscess

  • Anaphylaxis

  • Bronchiolitis

  • Chest pain

  • Croup

  • Ectopic pregnancy

  • Gastroenteritis

  • Intussusception

  • Minor head trauma

  • Syncope

  • Urolithiasis

Results of provider survey

The survey used to evaluate the success of implementation had a response rate of 70% (n=60) for physicians and 50% (n=36) for nurses. Table 1 shows the results.

Table 1

Provider survey

There was good self-reported adherence, with 95% of physicians and 89% of nurses reporting using the relevant EBG when caring for a patient with an EBG condition; adherence was corroborated by data collected on the different measures.29 Atypical disease presentation, resident orders placed without awareness of guideline recommendations and requests by referring physicians were cited as common reasons for non-compliance (data available on request). Physicians rated the online and posted algorithms as very or somewhat helpful implementation strategies (78% and 79%, respectively), and nurses found the posted algorithms helpful (69%). In over 90% of responses, physicians and nurses indicated that the programme improved the quality and standardisation of care. Overall, the majority of providers rated the guideline programme at the most favourable level on a 4-point Likert scale.

Performance on individual physician and other quality measures

Our EBG programme led to changes in practice and/or significant reductions in resource utilisation and costs. We showed, using time series methodology, a decrease in resource utilisation (radiographs, viral testing and β-agonist therapy) for patients with bronchiolitis without a change in outcomes.29 In addition, we were able to decrease resource utilisation (blood testing and intravenous fluid) and increase rates of pregnancy testing in pubertal girls and ECG testing for patients with syncope.30 In a third publication demonstrating the effectiveness of the programme, we were able to decrease rates of brain imaging for patients with minor head injury, and at a rate faster than that of the national decline.31 Selected additional measures, where there were reductions in resource utilisation and/or improved practice, are shown in figures 2 and 3 (and see online supplementary appendices D and E).

Figure 2

Funnel plots showing individual physician rates, before and after guideline initiation, for: (A) ordering chest radiographs in patients with bronchiolitis; and (B) giving intravenous fluid to patients with gastroenteritis.

Figure 3

Rates of CT scanning for patients with minor head injury, (A) by insurance type; and (B) by racial/ethnic group.

We also eliminated disparities in some areas. For instance, we found that prior to the roll-out of the minor head injury EBG, patients with minor head injury who were Caucasian or had private insurance were more likely to have a brain CT scan than African American or Hispanic children, or children with public insurance. Through the EBG, we reduced the overall rate of head imaging and eliminated these disparities (figure 3).

The individual performance report was particularly helpful in ‘pulling’ outliers towards the group average, which is likely more reflective of the care appropriate for our patient cohort than of individual preferences (figure 2). As balancing measures, there were no adverse events and no increase in the rate of returns resulting in admission within 72 h for any of these conditions in the 3 years after initiation of the guideline programme. (The control chart for the croup EBG return rate appears in online supplementary appendix F.)

Costs of the programme

The initial costs of developing and implementing the guidelines were relatively small and entirely in kind. Most of the direct cost was for creating posters displaying the algorithms, the basic recommendations of each EBG or performance data. Otherwise, costs were indirect, such as the clinicians' time spent in the initial development and maintenance of the algorithms.

Use of guidelines resulted in decreases in resource utilisation (eg, syncope and minor head trauma), and quantitative analyses of cost savings are currently underway. However, it is important to note that while many of the guidelines recommended against testing or treatment, some advocated certain evidence-based therapies, such as the use of corticosteroids in patients with croup and the monitoring of patients with anaphylaxis for at least 4 h prior to discharge from the ED, and these might have led to increased resource utilisation.


Discussion

We present here a practical process for developing an EBG programme in a paediatric ED. The major lessons from this project were:

  1. A local guideline programme is helpful in improving the value of care.

  2. Developing guidelines is never, in the strict sense, complete, since new evidence will usually emerge; guidelines will need to be updated as appropriate.

  3. Implementation could be considered the most important and challenging phase of a guideline programme. Adequate support in terms of personnel and materials, as well as a focus on the strategies most appropriate to the local setting, is crucial to successful implementation.

  4. Newer initiatives will not necessarily jeopardise the success of other ongoing initiatives, if well synchronised.

We created a robust yet streamlined process for guideline development. By using local experts to either adapt a national guideline or develop a new one, we were able to leverage peer influence in driving adoption.2,32 Through this local effort, we avoided barriers to guideline implementation such as incompatibility of recommendations with local values, lack of credibility of the guideline developer(s) with end users and recommendations not well integrated into the workflow.7,8,33,34 Local experts were able to develop guidelines that were end-user sensitive yet evidence-based, and peers were able to offer expert opinions on the recommendations to the local developers. This helped improve the value of, buy-in to and compliance with the EBGs.

There were many challenges in algorithm development. Although we capitalised on existing resources to develop EBGs, considerable time was expended. One guideline leader estimated almost 40 h over a 6-month period for the development of the algorithm alone. The substantial time invested by the EBG team, EMR liaison and others was not measured. Furthermore, we learned from the survey that some nurse champions would have liked involvement earlier in the process and more guidance in understanding their role. Earlier and more directed nursing involvement and more generous use of wall displays may have improved awareness and adoption by nurses.

Since EBGs are dynamic, with the possibility for new evidence emerging after guideline introduction,3 it is important to have a plan for keeping guidelines current. We have instituted an annual review process for the guideline owners.

Implementation can be the most challenging aspect of the use of guidelines.7–12 We quickly learned that substantial effort was required for successful implementation in a complex environment with hundreds of providers. Strategies proven to be effective in guideline implementation were chosen.7,9,10,12,34–39 For instance, as much as possible we avoided passive strategies such as didactic sessions and instead used active strategies such as interactive sessions and one-on-one discussions.11,12 Different strategies were also effective at different stages of implementation: awareness, agreement, adoption and adherence.15 For example, the interactive sessions at the physicians' weekly conference were successful in raising awareness of EBG recommendations and in increasing the likelihood of agreement with individual recommendations. However, to ensure adoption and adherence, supportive systems such as computerised order entry, electronic prompts and feedback were very important.2

As shown by previous studies,40,41 the survey revealed that the effectiveness of the implementation strategies varied between and within groups. For instance, most nurses found the posted algorithms very effective, while most physicians thought the online algorithms were very effective. It is therefore important to tailor strategies to those most effective for a given group, and to use multifaceted interventions so that at least one appeals to each provider.42 Improving nursing engagement was particularly challenging across the spectrum of activities in the ED, a difficulty not limited to the EBG programme per se. Reasons for the low nursing response rate include, but are not limited to, a large proportion of part-time nurses, a change in hospital policy that eliminated reimbursement for meeting time and the preference of most nurses for print or face-to-face contact as their primary mode of communication, unlike most physicians in the department, who used email as their primary mode of communication. Furthermore, physicians have a weekly 4-h meeting that provides an opportunity to discuss important topics like EBGs, with no equivalent nursing meeting at which a significant proportion of nurses are present. We continue to work with nursing leadership on improving protected time for nurses to attend meetings. We have also employed some innovative methods such as the ‘Education Tree’, whereby nurses well engaged with the EBG programme are each assigned three to four nurses whom they are responsible for keeping up to date with EBG recommendations. The high number of rotating residents also presents a challenge to guideline adherence; ongoing training is underway.

Interestingly, although potentially risky, we found a synergistic effect from rolling out the guidelines in quick succession. It is possible that the introduction of each successive guideline kept awareness of the EBG programme fresh for providers. The implementation of the guidelines did not extinguish improvements seen in concurrent projects in the department, such as improving care for patients with sepsis;43,44 rather, we believe the EBG programme may have further reinforced the department's improvement culture. It may be helpful to introduce together guidelines that have similar recommendations and can benefit from such a synergistic effect, such as those aimed at reducing the use of ionising radiation.

The health information system of the hospital was crucial. We have the benefit of access to a robust data warehouse and were able to measure many components easily. Others were more challenging to measure and required chart review or measure revision to improve feasibility.

One challenge unique to the ED is identifying meaningful outcomes given the episodic nature of the care we provide. Even so, there were significant opportunities to improve value by improving efficiency, equity and costs.

Limitations of this effort include the possibility that guidelines were created in which local conventions trumped evidence. However, we relied on the adaptation of national guidelines when available, complete and thorough peer review, the academic strength of the physician and nurse leaders, and oversight by division leaders to mitigate this possibility.


Conclusions

We have shown that it is possible to develop a successful and valuable guideline programme in a setting as complex as an ED. The keys to the programme's success were strong leadership support and local ownership of the guidelines, selection of motivated champions, development of a practical process for guideline development and implementation, peer consensus among a highly academic group, and rigorous performance monitoring with frequent feedback to stakeholders. We believe that the fundamentals of this EBG programme may be adapted to other clinical settings.


Acknowledgements

The authors thank the Chief of the Division, Dr Richard Bachur, the EBG Champions, the Implementation Team, and all the emergency doctors and nurses for their dedication to the EBG project. The authors also thank Jonathan Finkelstein for his helpful comments on previous drafts of this manuscript and Stephanie Parver for assistance with manuscript preparation.


Supplementary materials


  • Contributors ATA contributed to the conception and design, analysis and interpretation of data, drafted the initial manuscript, and approved the final manuscript submitted. AMS contributed to the conception and design, analysis and interpretation of data, reviewed and revised the manuscript, and approved the final manuscript submitted.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement There are no unpublished data for development and implementation.
