Quality and Accountability

How did we do it?


Commitment

In 2007, Lorris Betz, M.D., Ph.D., senior vice president for health sciences, launched a new initiative with a clear directive: make the patient experience exceptional. He held retreats with hospital and physician leadership to ask what that would entail. It turned out to mean everything from the smile patients see when they walk through our doors to a positive outcome for their surgeries. Quality was an integral component of his call to action. As a result, our leadership, doctors, nurses, and staff took on quality as a top priority. Attention was brought to bear on everything from the first phone call to the last appointment, from root cause analysis and data mining to changes in process and clinical practice. The organization was mobilized.

Our Strategy

Hospital and physician leadership examined the dozens of quality measures available across the nation. It became evident that the Consortium’s was the most rigorous and most appropriate gauge of our system’s quality. The Quality and Safety Department, directed by Carol Hadlock, set out to better understand the measure. “The Consortium’s mathematical model for determining quality in academic health systems is extremely complex, and they don’t share it. So we decided to build our own to match theirs. Then we knew exactly what cases and what clinical situations contributed to specific quality measures,” Hadlock explains. Once her team began to mine the data, Hadlock connected with physician leaders to begin reviewing patient cases one at a time.
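The Consortium’s actual model is proprietary, so the sketch below is only a rough illustration of the general approach a team like Hadlock’s might take; the case records, risk weights, and field names are invented for illustration. The idea is to give each coded case an expected outcome based on its documented condition and acuity, then compare observed outcomes to expected ones, so every case can be traced to the measures it moves.

```python
# Hypothetical sketch of a risk-adjusted quality measure.
# The case records, risk weights, and field names are invented; the
# Consortium's real model is proprietary and far more complex.

cases = [
    {"service": "neurology", "drg": "stroke",      "acuity": "high", "died": True},
    {"service": "neurology", "drg": "stroke",      "acuity": "low",  "died": False},
    {"service": "surgery",   "drg": "head_trauma", "acuity": "high", "died": False},
]

# Expected probability of death for each coded condition/acuity pair
# (made-up numbers standing in for a published risk model).
expected_mortality = {
    ("stroke", "high"): 0.30,
    ("stroke", "low"): 0.02,
    ("head_trauma", "high"): 0.40,
}

def observed_to_expected(case_list):
    """Return the observed-to-expected mortality ratio for a group of cases."""
    observed = sum(1 for c in case_list if c["died"])
    expected = sum(expected_mortality[(c["drg"], c["acuity"])] for c in case_list)
    return observed / expected if expected else float("nan")

# Tracing each case through a calculation like this shows which cases,
# and which coded details, move a given measure.
print(round(observed_to_expected(cases), 2))
```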

This exhaustive review of patient records revealed a number of issues, including problems with how care providers documented their patients’ conditions. Providers’ documentation drives what is “coded” into a patient’s record. When items were missing or incorrect in the documentation, the record was mis-coded, often to the detriment of our quality scores.

This was welcome news to some, such as Stefan Pulst, M.D., chair of neurology. “I was very surprised when I came here in 2007 from Cedars-Sinai in Los Angeles that the neurology department’s reported mortality was 1.8 times the national average. I knew the physicians, the range of services, and the excellent level of care, and something didn’t add up.” Pulst and William Couldwell, M.D., Ph.D., chair of neurosurgery, investigated their cases and discovered that providers were not documenting the true condition of patients who transferred into our system. Because we have a neurocritical care unit and a regional stroke center and offer services other systems do not, we care for a larger proportion of critically ill patients. Quality measures such as the Consortium’s take the acuity of a patient’s condition into account when determining mortality scores. You can imagine that a patient with severe head trauma and a poor prognosis carries a very different expected mortality than an emergency department patient with a broken finger. “When we appropriately documented and coded our cases, our mortality score dropped to around half of the national average,” says Pulst. That score more closely matched the level of care he knew had been delivered at the U for years. Importantly, now that the mortality measure was reporting correctly, the department could focus on fine-tuning other aspects of quality and patient safety.

Another important benefit of reviewing patient cases one at a time was that quality specialists and physicians recognized opportunities to improve clinical practice. Each case yielded more information to hone our protocols and refine our best practices.
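The arithmetic behind that turnaround is worth spelling out. In broad strokes, a risk-adjusted mortality index is the ratio of observed deaths to the deaths the model expects given each patient’s documented acuity; the counts below are invented, chosen only so the ratios mirror the ones Pulst describes.

```latex
% Hypothetical counts; only the ratios echo the figures in the text.
\[
  \text{Mortality Index} = \frac{\text{observed deaths}}{\text{expected (risk-adjusted) deaths}}
\]
% Under-documented acuity: 18 observed deaths against 10 expected
% gives 18/10 = 1.8 times the benchmark.
% With accurate documentation, the same patients yield an expected
% count of 36, and 18/36 = 0.5, about half the benchmark, with no
% change in the care actually delivered.
```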

Focused Execution

Armed with data and a strategy, teams set to work. Nurses diligently educated patients about smoking cessation, a core quality measure. Because the correct selection and timing of antibiotics are critical to improving surgical outcomes, all physicians involved in surgical cases were brought up to speed on best practices. What’s more, specific protocols and order sets were built into the computerized provider order entry (CPOE) system as a check-and-balance measure to ensure consistent care. Efforts like these took hold across the system.
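As a simplified illustration of the kind of check an order set can build into a CPOE system, the sketch below flags a surgical case whose prophylactic antibiotic falls outside an assumed drug list or an assumed 60-minute pre-incision window. The drug list, time window, and function names are hypothetical, not the system’s actual rules.

```python
# Hypothetical sketch of an order-set check in a CPOE system.
# The approved drug list, the 60-minute window, and the record format
# are assumptions for illustration only.
from datetime import datetime, timedelta

APPROVED_PROPHYLAXIS = {"cefazolin", "cefoxitin", "vancomycin"}

def antibiotic_check(antibiotic: str, dose_time: datetime, incision_time: datetime) -> list[str]:
    """Return warnings if the prophylactic antibiotic order falls outside
    the (assumed) selection and timing rules."""
    warnings = []
    if antibiotic.lower() not in APPROVED_PROPHYLAXIS:
        warnings.append(f"{antibiotic} is not on the surgical prophylaxis order set")
    lead_time = incision_time - dose_time
    if not timedelta(0) <= lead_time <= timedelta(minutes=60):
        warnings.append("dose is not within 60 minutes before incision")
    return warnings

# Example: a dose given 45 minutes before incision passes both checks.
print(antibiotic_check("cefazolin",
                       datetime(2010, 3, 1, 7, 15),
                       datetime(2010, 3, 1, 8, 0)))
```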

In some areas, the collaboration between quality specialists and physician leadership was exemplary. Matthew Peterson, M.D., chair of obstetrics and gynecology, explained that his department was so committed to quality improvement that it funded half the salary of a quality specialist so providers could receive real-time data. “John Arego has been amazing to work with. He provided us with weekly reports and identified cases where there was room for improvement,” he says. Peterson initiated a thorough review of the national literature to stay current on best practices, then discussed clinical practice at his faculty meetings. After every flagged quality event, Peterson personally sat down with the physician involved and talked through what should be done differently in the future. “It was a matter of accountability,” he says, “and the physicians in our department were happy to step up to the challenge.”

An Honor Earned by Everyone

While we performed focused work in dozens of areas, the entire organization stepped up its attention to the patient experience and quality outcomes. The Consortium’s study examines 10 questions from the Hospital Consumer Assessment of Healthcare Providers & Systems (HCAHPS) patient survey, developed by the Centers for Medicare & Medicaid Services and mailed anonymously to patients’ homes. In addition to questions about pain management and medications, the survey probes the cleanliness and quietness of the hospital, the responsiveness of staff, and the overall experience. Everyone from housekeeping and nutrition care to billing and check-in staff contributed to making every aspect of the patient experience exceptional. In the process, we learned that quality encompasses medical knowledge, technical excellence, compliance with evidence-based processes, and the totality of the patient’s experience.