Joint Commission to Suspend Hospital Ratings in 2016

November 25, 2015

The Joint Commission (JC) is taking a breather from the topsy-turvy world of top hospital ratings next year in hopes of getting its bearings.

The nation's main accreditation agency for hospitals is one of several groups that annually rate and rank these institutions, using a variety of performance yardsticks, some more scientifically respected than others. Since 2010, the JC has singled out what it calls Top Performer hospitals for scoring consistently high on measures of evidence-based care. The measures now cover 49 clinical processes such as performing fibrinolytic therapy within 30 minutes for patients with heart attacks, screening for tobacco use, immunizing against influenza, and stopping antibiotics within 24 hours of surgery. The JC describes them as closely linked to good patient outcomes.

Last week, the JC published its list of 1043 hospitals that attained Top Performer status because of their performance in 2014. At the same time, it announced it would not issue such a list in 2016 regarding 2015 performance. It intends to use the hiatus to rethink how it should evaluate hospitals in the "evolving national measuring environment."

The problem is, the JC does not like how hospital quality metrics are evolving, especially on the federal side. There was a time, however, when the JC and the federal government were in sync, said Mark Chassin, MD, MPH, the group's president and chief executive officer, in an interview with Medscape Medical News.

Dr Chassin explained that in the mid-2000s, the Centers for Medicare & Medicaid Services (CMS) and its Hospital Compare program began requiring facilities to report their performance on quality measures the JC had introduced just a few years earlier. "For the first 10 years or so, CMS measures and ours were identical," he said. And that was a good thing, because the JC didn't want to burden hospitals with collecting two different sets of performance data.

"What's happened in the last few years is that CMS has added measures that we don't have in common," said Dr Chassin. Some these measures are based on data culled from Medicare claims, which the JC and others consider a poor gauge of clinical performance because they do not accurately describe severity of illness and complications.

Human Eyes Beat a Computer Search

CMS also has increasingly retired so-called chart-based measures, favored by the JC, which rely on data manually abstracted from the patient's electronic or paper record. In some cases, CMS has deemed them no longer needed because hospitals uniformly post such high scores on them (close to 100%) as to make comparisons irrelevant. The JC, however, believes in preserving these "topped-out" measures to keep hospitals on their toes.

The government's Hospital Compare program is pushing facilities to replace other chart-based measures with "electronic clinical quality measures" (ECQMs), which are based on computerized searches for discrete data in electronic health records. ECQMs and chart-based measures may seem identical, but there are subtle differences that may make the former less trustworthy, said Dr Chassin. Take an ECQM that checks to see whether patients with heart attacks were discharged with a recommended beta blocker prescription. What about a patient who does not receive the drug because of a contraindication? That information usually is in the unstructured narrative section of the electronic health record, which human eyes might catch, but an ECQM and its fixation on data fields would not. In the latter case, a patient who should be excluded from the measurement would be counted.

Compounding that problem is wide variation in electronic health record design and where different systems squirrel away data, such as patient allergies. Labor-intensive chart-based measures would pull out the needed details, wherever they are.
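To make the distinction concrete, here is a minimal Python sketch, assuming entirely hypothetical record fields and contraindication wording; nothing in it comes from an actual ECQM specification or EHR schema. It contrasts a structured-fields-only calculation with one that also reads the free-text note, showing how the first counts a patient who should have been excluded.

```python
# Hypothetical sketch of why a structured-fields-only (ECQM-style) query can miss
# an exclusion that is documented only in the free-text portion of the chart.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DischargeRecord:
    diagnosis: str
    discharge_medications: List[str] = field(default_factory=list)
    narrative_note: str = ""  # free-text portion a chart abstractor would read

def ecqm_beta_blocker_rate(records: List[DischargeRecord]) -> float:
    """Structured fields only: every heart attack discharge lands in the
    denominator, even if a contraindication is documented only in the note."""
    denominator = [r for r in records if r.diagnosis == "acute MI"]
    numerator = [r for r in denominator if "beta blocker" in r.discharge_medications]
    return len(numerator) / len(denominator)

def chart_abstracted_rate(records: List[DischargeRecord]) -> float:
    """Chart-review-style logic: the abstractor spots the contraindication in the
    narrative and excludes that patient from the denominator."""
    denominator = [r for r in records
                   if r.diagnosis == "acute MI"
                   and "contraindication to beta blocker" not in r.narrative_note.lower()]
    numerator = [r for r in denominator if "beta blocker" in r.discharge_medications]
    return len(numerator) / len(denominator)

patients = [
    DischargeRecord("acute MI", ["beta blocker", "aspirin"]),
    DischargeRecord("acute MI", ["aspirin"],
                    narrative_note="Contraindication to beta blocker: severe asthma."),
]

print(ecqm_beta_blocker_rate(patients))   # 0.5 -- the excluded-on-paper patient drags the score down
print(chart_abstracted_rate(patients))    # 1.0 -- contraindicated patient removed from the denominator
```

The sketch only illustrates the logic gap Dr Chassin describes; in practice, where each EHR stores such details varies from system to system, which is exactly the variation noted above.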

Dr Chassin said he does not oppose ECQMs per se. He just wants to see studies proving they are just as accurate as chart-based measures, which clinicians have come to trust.

Faced with asking hospitals to collect and submit data on chart-based measures that CMS is no longer requiring, the JC decided to ease their workload in 2015 and give them some flexibility on which measures to report. As a consequence, "most hospitals have decided not to send us a large volume of chart-based measures," said Dr Chassin. "We won't have enough measures to do the Top Performer program the same way we've done it in the last 5 years."

Rather than disappearing, the Top Performer program "will likely take a different form," as determined during the hiatus year of 2016, he said. "We do want to recognize hospitals that are taking great strides in improvement." The program, he added, has been a "counterweight" to other hospital rating approaches that "use bad, invalid data on quality;" for example, US News & World Report magazine factors in a hospital's reputation among clinicians to produce its annual ranking.

"Where Is the Added Value?"

One quality improvement consultant said the JC faces hard choices as it tries to keep up with changes in performance measures.

Michael Millenson, president of Health Quality Advisors and author of Demanding Medical Excellence: Doctors and Accountability in the Information Age, said that, in principle, it is a good idea for the JC to align its metrics with CMS to avoid burdening hospitals with extra data collection. That said, "how does the [JC] collecting the same measures as the government help hospitals?" Millenson told Medscape Medical News.

"Why not just rely on Hospital Compare? Where is the added value [of the JC] reporting this stuff?" he said. He suspects it has more to do with hospitals wanting to brand themselves as Top Performers than educating the public. Top Performer data, he said, are user-unfriendly for consumers.

And if the JC objects to the ECQMs used by CMS, but the government will not change its methodology, what can the JC do? "Maybe they should have fewer measures and make them more usable [for consumers]," he said.

The JC, he noted, has a legitimate point in questioning the CMS practice of retiring topped-out measures that almost every hospital is acing. "It could be that if you stop measuring, you'll slide backwards."

Number of Top Performers Declined 15%

The 1043 hospitals named as Top Performers represent nearly 32% of all hospitals accredited by the JC that reported performance data for 2014. The 49 quality measures fell into 12 categories: heart attack, heart failure, pneumonia, perinatal care, surgical care, children's asthma care, inpatient psychiatric services, venous thromboembolism care, stroke care, immunization, tobacco treatment, and substance abuse. The last two were new for 2014.

To earn Top Performer status, a hospital had to submit 2014 data in at least six of the 12 categories, up from four in the previous year. That change, plus the debut of two more measure categories (tobacco treatment and substance abuse) and an additional measure in the inpatient psychiatry set, raised the bar last year, explaining why the number of Top Performers fell almost 15% from 1224 hospitals in performance year 2013, according to Dr Chassin.

More information about the top-performing hospitals named by the Joint Commission is available on the group's website.
