Evidence-based care combines the best available research evidence with individual clinical expertise to make decisions with patients. Although much of the focus to date has been on generating evidence from high-quality research such as randomized clinical trials (RCTs)1, the next critical step is to ensure that the evidence is applied to the care of patients. This article used an evidence-based practice guideline and a multidisciplinary team to improve the care of children with osteomyelitis.
Osteomyelitis is a relatively common pediatric orthopaedic issue that has become increasingly problematic because of the emergence of virulent and treatment-resistant organisms such as methicillin-resistant Staphylococcus aureus (MRSA). Osteomyelitis can be classified as complicated or uncomplicated. Uncomplicated osteomyelitis typically has a short duration of symptoms, minimal local clinical signs, and no radiographic changes, and responds rapidly to antibiotics. Complicated osteomyelitis presents late or has a short, rapidly progressive course, with local signs of an abscess, systemic signs of infection, extensive soft-tissue involvement, and radiographic changes, and usually requires operative intervention with prolonged antibiotic therapy. Failure to adequately treat osteomyelitis not only leads to prolonged hospitalization, but also may result in permanent disability and/or chronic infection. As evidenced by the experience of the hospital in this report, the treatment of osteomyelitis is variable among individuals, institutions, and regions. Thus, the application of best evidence has the potential to have a significant impact on outcome for children with osteomyelitis.
Clinicians can easily be overwhelmed by the amount of clinical evidence. Furthermore, clinicians may not have the time or expertise to critically appraise the literature and to identify the best evidence. Practice guidelines are one strategy to provide evidence in a usable format for clinicians. Although the elements of effective guidelines are beyond the scope of this commentary and require formal appraisal methods such as the Appraisal of Guidelines for Research and Evaluation (AGREE) instrument or the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach, they include a focus on high-quality evidence, a systematic approach to identifying and appraising the evidence, and the involvement of a multidisciplinary and expert group with broad input into the final product. However, even with appropriate guidelines, sometimes the challenge is changing physician behavior.
The four steps in implementing guidelines are awareness, agreement, adoption, and adherence2. All of these steps require attention and, although they are also beyond the scope of this commentary, generally require multiple strategies for implementation3.
In evaluating this article, several issues need consideration. First, the improvements observed by Copley et al. may have been due to the daily rounding and the multidisciplinary team rather than the guideline itself. Second, irrespective of the source of the benefits, not all centers have the staff and/or commitment to replicate their methods. Third, the mere act of studying a topic, including the presence of a research coordinator, no matter how unobtrusive, may lead to improvements, a phenomenon called the Hawthorne effect. Fourth, the authors did not provide complete details on how they made the diagnosis of osteomyelitis. Although it may be obvious in patients undergoing surgery, the diagnosis may be less clear in patients with uncomplicated osteomyelitis, leading to uncertainty about their study patients. Fifth, the surgical intervention rate in this study was high. Although this rate may reflect the high prevalence of MRSA, there were no criteria for when to operate, and perhaps some of their patients would have improved without surgery. Sixth, at least for uncomplicated osteomyelitis, recent evidence from randomized trials suggests that a shorter duration (a total of three weeks) of antibiotics is sufficient4. Seventh, the guideline development process is not described well and appears to have relied on consensus for some aspects of the guideline. Although expert clinical input is required, a truly evidence-based guideline requires high-quality Level-I and Level-II evidence, which is lacking for much of the care of patients with osteomyelitis. Eighth, the guideline was evaluated with use of historical controls in a Level-III retrospective study. More convincing evidence for the effectiveness of the guideline would require an RCT. Finally, similar to the evaluation of a treatment, the determination of a guideline's effectiveness often requires consideration of multiple outcomes.
This article focused on what are often termed "process measures," such as the use of tests and the duration of antibiotic use. In addition, the authors provided information on costs and the main driver of costs, the length of hospital stay. From these perspectives, their guideline was effective, having improved the diagnostic workup and having provided a higher rate of organism identification and more consistent use of antibiotics, with a trend toward lower readmission rates and decreased lengths of hospital stay. What are lacking, however, are important clinical outcomes such as complete eradication of the infection, return to school or play, lower disability, and better quality of life.
In conclusion, this study evaluated the important step of guideline implementation in the pathway from generating evidence to improving patient outcomes. Although the evidence base for the guideline was not strong and the guideline evaluation did not consider patient outcomes, this guideline probably improved patient outcomes and should be implemented more broadly.