If you asked 50 people what's meant by evidence-based medicine, chances are good you'd get a number of different definitions. I say "chances are" because I'm basing that statement on my own experience as a physician who gets lots of questions from colleagues and friends, not on a specific study in which data was collected from a large, demographically diverse group of people and analyzed over an appropriate length of time. In other words, I've given you an informed opinion, but not evidence.
It’s not surprising we sometimes get raised eyebrows over the phrase itself. People may well wonder: Isn’t saying evidence-based medicine redundant, like saying wet water? After all, medicine is a science and science is based on evidence. This trusting view of medical care has been borne out in many studies that show people evaluate quality in hospitals based on things like a physician’s bedside manner, call-light response times, the friendliness of the staff, the age of the facility, or the taste of the food, rather than on clinical outcomes. In fact, one 1997 study done by the Institute for Quality Center in Hudson, Ohio showed that “patients use more than 500 criteria to evaluate a hospital’s quality” and often use “emotional criteria to evaluate technical quality.” That’s a good reminder not to neglect the human side of healthcare and the part it plays in healing.
Still, when we're talking about evidence-based medicine at Intermountain Healthcare, we're talking about gathering specific, unbiased data and fearlessly looking at what it tells us. I say fearlessly, because we know from our own experience that there are times the data challenges our assumptions, our biases, our memories, and the opinions of experts we respect. In fact, medicine has a history (especially the farther back you go) of being expert-based rather than evidence-based. What's the difference between the two? I'd sum it up this way: If, as a physician, you rely solely on what you've been taught, or on your memory of experiences with your patients, and you resist what the data reveals, you may be missing crucial information that could alter your practice in ways that improve outcomes for your patients. Of course, most physicians with expertise do seek out the information derived from data and from studies that have been conducted well and meticulously reviewed by peers. But there is still the dilemma of getting that information out there, and of supporting the implementation of new processes.
The example we often use at Intermountain is an important one, and one that's fairly easy to understand. Over the past decades, many expectant mothers, with the support of their doctors, chose to have their deliveries induced prior to 39 weeks gestation, simply for comfort or convenience. Meanwhile, studies were telling us that inducing labor before 39 weeks led to more problems for newborns, including more of them requiring ventilators. Many physicians, relying on memory, didn't think it had been a problem for their patients—but when we showed them the data gathered on thousands of deliveries in our own Intermountain hospitals, they recognized that early inductions are clearly associated with increased risk to baby and mother. At Intermountain, we adopted the best practice of inducing labor before 39 weeks only when there was a medical need, shared that best practice with our physicians, and reduced the number of early elective inductions. The result was healthier newborns, fewer C-sections, and lower overall medical expense in our communities—which was rewarding for both families and caregivers.
Our emphasis on evidence-based medicine is one of the things that sets Intermountain apart. The teams in our Clinical Programs continually review the medical literature, specialty society guidelines, and our own objective experience in order to define best practices. They then focus on how to share and implement those best practices. And clinicians will be pleased to know that our administrative, finance, and other support teams are learning from their success with best practices. For the past six years, we've been seeking and implementing evidence-based operational best practices in addition to our clinical efforts, improving the efficiency and efficacy of what we do.