Hospital Reporting Unrelated to Better Patient Outcomes

Tara Haelle

October 16, 2018

SAN DIEGO — No correlation exists between mortality rates and process measures reported to the Centers for Medicare and Medicaid Services (CMS) Hospital Compare program, new research shows.

"We know that reporting comes with substantial costs to hospitals, so it's questionable whether the cost and effort associated with these reporting measures is justified," said Laura Burke, MD, from the Beth Israel Deaconess Medical Center in Boston.

"We spend a lot of money on care in the United States, but our outcomes are not always optimal," she told attendees here at the American College of Emergency Physicians 2018 Scientific Assembly.

"There's broad consensus that measuring quality is important, but how best to do so is still controversial," she added. Outcomes are what matter most to patients, but they are hard to measure, particularly because they require risk adjustment.

In many medical specialties, data "suggest that the way hospitals perform on process measures isn't associated necessarily with better outcomes," said Burke.

To see whether performance in emergency medicine is associated with patient outcomes, Burke and her colleagues compared process measures for acute myocardial infarction with patient mortality rates.

The quality measures that hospitals must report to CMS for acute myocardial infarction include median time to ECG, aspirin at arrival, percutaneous coronary intervention within 90 minutes of hospital arrival, and median time to transfer to another hospital for percutaneous coronary intervention.

The researchers evaluated each of these four measures at 4481 hospitals using national Medicare data from the Hospital Compare program. The study cohort consisted of 421,795 Medicare beneficiaries, 65 years and older, who were admitted to the hospital for acute myocardial infarction from 2013 to 2015.

The team assessed mortality 3, 7, 14, and 30 days after admission, adjusted for patient age, sex, and chronic conditions.

Time to ECG was reported by 54.4% of hospitals, aspirin at arrival by 54.1%, percutaneous coronary intervention within 90 minutes by 33.9%, and transfer times for percutaneous coronary intervention by 13.8%.

However, even though performance on the process measures varied widely, mortality rates remained similar across all quartiles.

Table. Outcomes by Quality Quartile for Acute Myocardial Infarction

Measure / Quartile          Performance    Mortality Rate, %
Average time to ECG
  Lowest 25%                15.1 min       14.6
  Middle 50%                 7.7 min       14.5
  Highest 25%                3.7 min       14.8
Aspirin at arrival
  Lowest 25%                89.8%          14.3
  Middle 50%                97.7%          14.9
  Highest 25%               100.0%         14.6
PCI within 90 minutes
  Lowest 25%                88.4%          12.8
  Middle 50%                97.2%          13.0
  Highest 25%               100.0%         12.8
Time to transfer for PCI
  Lowest 25%                101.6 min      16.6
  Middle 50%                 55.1 min      16.2
  Highest 25%                35.5 min      15.6

"When comparing risk-adjusted mortality across these groups, we found no significant association between hospital performance on process measures for AMI and patient-level AMI mortality at any time point for any of the measures," Burke reported. "We found no evidence that hospitals performing better on most publicly reported ED process measures had better outcomes for patients with AMI."

Using mortality is a "blunt instrument," she acknowledged, adding that the team is doing research on outcomes such as readmission, healthy days at home, and cost.

The actual reporting rates are as concerning as the findings themselves, said Greg McKelvey, MD, PhD, from KenSci in Seattle.

"I think the bigger take-home is that there wasn't a high correlation between requiring reporting and people actually doing it," he told Medscape Medical News. "Compliance was really low across the board."

Despite the desire to incentivize good behavior, mandatory reporting instead appears to be a burden on emergency department physicians that is "failing on multiple fronts," he said.

A solution could be to move "from a push to a pull model, so it's not the providers' responsibility" to report what they're doing. "It needs to be collected and provided back to them as feedback," McKelvey said.

"Unfortunately, these quality measures are imposed on institutions, and a lot of work needs to be done to mine those data and provide them back to CMS," said Paul Casey, MD, from Rush University Medical Center in Chicago, Illinois.

"Ultimately, if it's not really correlated to improved outcomes, it's a lot of work on the part of the hospital and providers that doesn't necessarily improve patient care or patient outcomes," he told Medscape Medical News.

No external funding for this research was reported. Burke, McKelvey, and Casey have disclosed no relevant financial relationships.

American College of Emergency Physicians (ACEP) 2018 Scientific Assembly: Abstract 13EMF. Presented October 3, 2018.


