Evaluation of a Schizophrenia Medication Algorithm in a State Hospital

Wei-Chung Chuang, M. Lynn Crismon

Am J Health Syst Pharm. 2003;60(14) 

Discussion

After the schizophrenia treatment algorithm was disseminated, physician documentation changed in some areas, but no major overall changes were observed. Although the rate of documentation of the presence or absence of target symptoms increased, it remained low. In the post-algorithm group, improved documentation was observed for only one positive symptom (i.e., unusual thought content). Among the negative symptoms, emotional expression was the most frequently documented (37%), while the remaining symptoms were documented less than 20% of the time.

While these results are discouraging, they are similar to those of other studies that relied on education alone to implement guidelines. However, few studies have focused on education as the only intervention, and very few have actually measured outcomes.[15]

In psychiatric practice, placing patients in the appropriate algorithm stage can be inherently difficult because clinical histories gathered from patients and documentation of patient medication histories may not be reliable. Patients with severe mental illnesses may be experiencing symptoms or cognitive dysfunction that limit their ability to report past treatment responses, and they may not have well-informed or available caregivers.[16] Despite these obstacles, appropriate documentation should be a priority, since it is essential for determining patient outcomes and the effectiveness of pharmacotherapy. Lack of consistent documentation limited our ability to determine physician adherence to the algorithm. First, there was no documentation of patients' initial algorithm stage; therefore, several objectives (i.e., time spent in a particular treatment stage without improvement and percentage of patients moved to the next appropriate stage in the algorithm) could not be accurately evaluated. Second, physicians' intentions and reasons for maintaining or changing medications could not be readily inferred. For example, in patients who were discharged on both oral haloperidol and haloperidol decanoate, it is difficult to discern whether the physician intended for the oral haloperidol to be discontinued in the future or for the patient to remain on both forms indefinitely. This lack of information in the discharge summary makes it difficult for the treating outpatient clinician to know the goals of treatment.

Incomplete documentation of clinical data obscures the identification of errors regarding algorithm-recommended strategies, which, along with the complexity of some algorithms, confounds the evaluation of clinician adherence.[2] Accomplishing the goal of improving the quality of patient care through evidence-based practices includes monitoring clinician adherence to guidelines and providing feedback. Therefore, successful interventions should employ systems that provide information on performance and outcomes related to the desired behavioral change. Goldberg et al.[14] conducted a randomized controlled trial to determine the effectiveness of academic detailing and continuous quality improvement (CQI) teams in increasing compliance with national guidelines for treating hypertension and depression. Members of the CQI teams were given handbooks on how to improve quality using team-based approaches but were not provided with documentation forms or instructions on what data to collect when making and monitoring changes recommended by the algorithms. Not surprisingly, they found real-time data collection difficult. The ineffectiveness of both the academic detailing and the CQI teams in that study reinforces the idea that enabling methods, such as documentation forms, are needed to facilitate the recording of physician interventions and to allow review of the data used in decision making. In the study presented here, the Clinical In-patient Record Progress Note, a core component of the algorithm program, was not used, even though the form had been approved by the statewide medical records committee and its use was strongly encouraged by the state medical director during onsite training.

Bettinger et al.[17] assessed physician adherence to the TIMA algorithm for major depressive disorder in two Texas community mental health centers. Patient outcome measures and requested documentation were similar to those in the present study and included a symptom rating scale, severity of symptoms and adverse effects, and the clinician's global impression. The notable difference in implementation procedures was that the medical staff had incorporated a uniform clinical report form as a component of usual care. Because their physicians used a uniform chart documentation form, a set of "rules" could be applied to calculate adherence. Although documentation was inconsistent for all outcome measures, the form was used for all patients. Clinician adherence to the algorithm was relatively high, with overall rates in the 70% range, except for one physician who demonstrated very low adherence. This physician was hired after the medical staff had received training in the algorithm and the documentation process. These preliminary findings suggest that a medication algorithm can be implemented in the public mental health sector and that the use of a uniform documentation form is a key component of implementation.

Dennehy et al.[18] evaluated physician adherence to TMAP's algorithms for bipolar disorder. This analysis examined algorithm adherence during the actual TMAP comparative evaluation study[10]; thus, implementation was much more intensive (e.g., a clinical coordinator, ongoing technical support, and feedback) than in either of the TIMA real-life studies. Standardized clinical record forms and computerized objective rules were used to evaluate adherence. Although adherence rates over one year of outpatient care were generally high across the various domains of practice behavior, they were variable, with the greatest adherence (97%) in using algorithm-recommended medications and the least (55%) in scheduling visits for symptomatic patients. The authors stated that medication dosing within the recommended range was the strongest adherence variable and suggested that clinical record forms and computerized rules are a valid measure of adherence to the bipolar disorder algorithm. Furthermore, their preliminary conclusion was that adherence predicts clinical outcomes with about 44% accuracy. One might expect that the longer study durations (one year in the Dennehy et al. study and eight months in the Bettinger et al. study) would actually lead to lower adherence rates compared with this short-term study.

In addition to providing better health care services, implementing evidence-based practices can be a means of achieving accountability for the decisions made.[19] Recent research has emphasized the use of formal assessment tools that quantitatively monitor fidelity to an implemented treatment model.[20,21] This suggests that traditional progress notes cannot be used to accurately assess clinicians' adherence and fidelity to a treatment model.

The results of this study show that education alone is not effective in changing physician behavior and that psychiatrist documentation with traditional progress notes provides an inadequate description of patient care and outcomes. Furthermore, the inadequate and inconsistent documentation makes it difficult to assess true adherence to the algorithm.

Research indicates that, in mental health programs, the quality of implementation appears directly related to practice changes and outcomes.[21] Rubenstein et al.[22] focused on a collaborative approach to the treatment of depression in managed care settings. Primary care expert leaders hired depression nurse specialists, who attended a one-day training session, and psychotherapists, who spent at least two days in training. Both written and videotaped study materials were used, and detailed tracking materials for proper patient follow-up were provided to the psychotherapists. Guideline adherence rates for most interventions were above 70%, and many were near 100%. When the educational efforts were focused on local primary care providers, adherence rates were 80% with lectures and seminars and 48% with academic detailing. These findings support the idea that multifaceted educational interventions may be more successful than single interventions.[15] In another study, collaborative patient management by the primary care physician and a consulting psychiatrist, intensive patient education, and the monitoring of continued medication refills improved patient adherence to antidepressant regimens.[5] Patient satisfaction and clinical outcomes also improved in those with major depression. The investigators noted that behavioral changes did not endure after the intervention was withdrawn and that ongoing interventions are likely needed to keep care consistent with treatment guidelines.

Experts in organizational behavior argue that an emphasis must be placed on changing the organization if effective implementation of a new intervention is to occur in a practice setting.[20] This is emphasized in the Institute of Medicine's report on improving the quality of health care in the United States, which states that the primary problems with health care lie with dysfunctional organizations and systems of care, not with inadequate providers.[23] The report outlines simple rules and challenges for improving the quality of care and states that organizations must make implementing evidence-based practices a priority. To accomplish this, the quality of communication must improve among health care providers and between providers and patients. We must also improve the application of computer technology in documenting patient care processes, monitoring and evaluating care, and establishing efficient communication. Furthermore, the health care industry must become more serious about continuous quality management as a routine practice to enhance the quality of care provided.

A treatment algorithm is a product and, by itself, does nothing. A plan for implementation and organizational change must be created that provides a mechanism for the treatment algorithm to actually be put into practice within the organization. Along with implementation, a system of accountability must be established that holds both the organization and the providers accountable for implementing the algorithm. Rosenheck[20] regards this measure of accountability as fidelity and states that measures of fidelity of care must be examined at both the provider and the organizational levels. He states that champions for the implementation process must be present within the organization, and they must involve all stakeholders in ensuring that implementation occurs. These champions must help members of the organization recognize that the product and processes to be implemented are consistent with the organization's values and goals (e.g., improving the quality of care). Finally, the new product and processes must eventually become part of the organization's subculture to be successful in the long term.

TDMHMR has learned from its algorithm implementation attempts and is now seeking to apply these principles in practice. Collaboration with organizational behavior experts has occurred, and planning has begun for the development and implementation of a framework for change. Instruments are being created to assess fidelity to the algorithm at both the organizational and provider levels. Perhaps by applying these principles to algorithm implementation, future studies will demonstrate a greater degree of adherence.

Physician and staff education alone did not significantly alter providers' practice behaviors. Inadequate and inconsistent documentation of clinical outcomes made it difficult to assess physician adherence to the treatment algorithm.

Presented at the ASHP Midyear Clinical Meeting, Atlanta, GA, December 10, 2002.
