Joint Commission 'Best' List Draws Skepticism from C-Suite

Cheryl Clark, for HealthLeaders Media , September 22, 2011

The Joint Commission last week touted the first edition of its list of the nation's 405 "best" hospitals, identified as those that achieved top scores on a composite of process measures.

But based on conversations I've had with several healthcare executives, the TJC project has left many scratching their heads.

During a news briefing to launch the report, commission president Mark Chassin, MD, said he had high confidence in the measures "because they pass a very rigorous set of tests that when hospitals improve on these measures, outcomes for patients get better directly because of that work."

Hospitals that made the list checked all the right boxes, indicating that the correct procedures and processes were performed 95% of the time for every appropriate patient in 2010. That's a tall order.

The Joint Commission is taking these metrics so seriously that hospitals falling below 85% on their composite scores will have to raise them. 

Beginning January 1, 2012, organizations cited for compliance scores below 85% will have a period of time to come into compliance before their accreditation is put at risk, according to a Joint Commission spokeswoman.

Here's the rub: some health system officials whose hospitals made the list say they're questioning its worth, in part because they aren't totally on board with these process measures, and in part because they share the list with hospitals not known for high-quality performance in their respective communities.

And hospitals with national reputations for quality, as measured by Thomson Reuters or U.S. News & World Report – such as Johns Hopkins University, Stanford University Medical Center, Mayo Clinic, Cleveland Clinic, Geisinger Medical Center, Duke University Medical Center, and Massachusetts General Hospital – are not on the list.



4 comments on "Joint Commission 'Best' List Draws Skepticism from C-Suite"

C.L.Jones (9/23/2011 at 9:46 AM)
There are many elements of disconnect here. First, it's like comparing apples to oranges when comparing these two unique and different rankings and trying to draw the same conclusion. The JC is trying to use basic and core measures to rank basic standards of care. USNWR is a publication, informative, however, and not a peer-reviewed, medical-science-based journal. There are some interesting survey and measurement techniques in the USNWR methodology – which fortunately have improved over the years – but they carry a lot of opinion-based information, from research companies owned by physicians of these major "report" headliners. Bottom line: consumer beware.

Daniel Fell (9/22/2011 at 11:40 PM)
Sadly, it's the patient attempting to make informed decisions about his or her healthcare who is faced with how best to interpret another set of conflicting quality measures. While the lack of standards surely helps some hospitals to compete in the marketplace, long-term it continues to erode consumer confidence and trust. The industry doesn't need more healthcare ratings, rankings and awards - it needs more consensus on which ones matter.

mila michaels (9/22/2011 at 3:29 PM)
Not surprising C Van Gorder is perplexed. Scripps boasts the highest number of fines in San Diego county by the state licensing board. TJC has again proven that a true and unbiased rating shouldn't be bought.



