Heart Attack Deaths at ‘America’s Best Hospitals’

Jul 11, 2007
Heart Attack Death Rates Appear Lower at ‘America’s Best
Hospitals’

Individuals admitted for heart attack to a hospital ranked
as one of “America’s Best” by U.S. News & World Report are
less likely to die within 30 days than those admitted to a
non-ranked hospital, according to a report in the July 9
issue of Archives of Internal Medicine, one of the
JAMA/Archives journals [1]. Using a methodology similar to
the mortality measures recently released and publicly
reported by the Centers for Medicare and Medicaid Services
(CMS), the study also found that ranked hospitals were more
likely to have lower-than-expected death rates, although
many unranked hospitals did as well.

“Among the increasing number of academic, industry and
governmental profiling systems that evaluate and compare
hospitals, U.S. News & World Report’s annual issue of
‘America’s Best Hospitals’ for specialty and overall care
is one of the most well known,” the authors write as
background information in the article. “Despite their
prominent role in the public arena, the ability of the U.S.
News & World Report rankings to identify hospitals with
excellent survival rates for common cardiovascular
conditions is not known.”

Oliver J. Wang, M.D., of Yale University School of
Medicine, New Haven, Conn., and colleagues assessed 30-day
death rates among 13,662 patients admitted to 50 hospitals
ranked on the U.S. News list as the best in “Heart and
Heart Surgery” and among 254,907 patients admitted to 3,813
unranked hospitals in 2003. The researchers also compared
the hospitals’ standardized mortality ratios, calculated as
observed deaths divided by expected deaths: a ratio greater
than one indicates that a hospital had more deaths than
expected, and a ratio less than one indicates fewer deaths
than expected.
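
To make that measure concrete, the following is a minimal
Python sketch using hypothetical hospitals and counts; in
the study itself, expected deaths come from a
risk-adjustment model of patient characteristics, in line
with the CMS-style methodology described above.

    # Hypothetical example of a standardized mortality ratio (SMR):
    # observed 30-day deaths divided by the number expected after
    # risk adjustment. Hospital names and counts are made up for
    # illustration, not taken from the study.

    def standardized_mortality_ratio(observed: int, expected: float) -> float:
        """Ratio of observed to expected 30-day deaths for one hospital."""
        return observed / expected

    hospitals = {
        "Hospital A": (40, 52.0),   # fewer deaths than expected -> SMR < 1
        "Hospital B": (61, 58.5),   # more deaths than expected  -> SMR > 1
    }

    for name, (observed, expected) in hospitals.items():
        smr = standardized_mortality_ratio(observed, expected)
        verdict = ("fewer deaths than expected" if smr < 1
                   else "more deaths than expected")
        print(f"{name}: SMR = {smr:.2f} ({verdict})")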

After the researchers factored in patient characteristics,
the 30-day death rates were, on average, lower in ranked
hospitals vs. non-ranked hospitals (16 percent vs. 17.9
percent). When the hospitals were divided into four groups
based on these rates, 35 ranked hospitals (70 percent) were
in the group with the fewest deaths, 11 (22 percent) were
in the middle two groups, and four (8 percent) were in the
worst-performing group.

Eleven ranked hospitals (22 percent) and 28 non-ranked
hospitals (0.73 percent) had standardized mortality ratios
significantly less than one. Thus, although a ranked
hospital was far more likely to have a lower-than-expected
death rate, non-ranked hospitals with such favorable ratios
still outnumbered ranked hospitals by nearly three to one.
“As a result, the U.S. News & World Report ranking list
does not include many hospitals that have outstanding
performances for the care of patients with acute myocardial
infarction,” or heart attack, the authors write.

One reason for this may be the reputation component of the
rankings, which accounts for one-third of the overall
ranking score and is based on cardiologists’ opinions of
which hospitals provide the best treatment, the authors
speculate. “Citations by cardiologists likely favor
tertiary centers with strong subspecialty care for the most
critically ill patients while not necessarily reflecting
the perceived care for the overwhelming majority of
admissions for more common diagnoses, which in turn have a
more substantial impact on overall hospital outcomes,” they
continue.

“The U.S. News & World Report ranking, which includes many
of the nation’s most prestigious hospitals, did identify a
group of hospitals that was much more likely than
non-ranked hospitals to have superb performance on 30-day
mortality after acute myocardial infarction,” the authors
conclude. “However, our study also revealed that not all
ranked hospitals had outstanding performance and that many
non-ranked hospitals performed well. Consequently, although
the U.S. News & World Report rankings provide some guidance
about the performance on outcomes, they fall short of
identifying all the top hospitals with respect to 30-day
survival after admission for acute myocardial infarction
and include a few hospitals that are actually in the lowest
quartile of performance.”

Sean Michael O’Brien, Ph.D., and Eric D. Peterson, M.D., of
Duke University, Durham, N.C., note in an editorial
published in the same issue of the journal that although
hospital rankings are now published by a wide variety of
governmental and non-governmental organizations, it is
unclear how useful they are to patients [2].

“A growing literature of methodological studies presents a
sobering picture for patients who would like to use
available quality information to identify hospitals with
the best outcomes for a particular condition,” they write.
“Most systems seem to do a reasonable job at identifying
groups of hospitals that perform well on average, yet there
is considerable uncertainty regarding the true performance
of a particular hospital. As noted, some truly exceptional
hospitals will be improperly rated as poor whereas some
mediocre hospitals will be rated as excellent.”

However, that does not mean that assessing hospital quality
has no role in medicine, they write. Hospitals ranked
poorly should take action, and those ranked highly should
not boast or become complacent. “They need to understand
the potential inconsistency and fallibility of
quality-ranking systems. Moreover, they need to realize
that regardless of their true rank, their goal should not
be to merely beat their peers in the ratings but to strive
for optimum performance. In this type of quality
competition, the real winners are the patients,” Drs.
O’Brien and Peterson conclude.

References:
1. Arch Intern Med. 2007; 167(13):1345-1351.
2. Arch Intern Med. 2007; 167(13):1342-1344.