In my opinion

Closing the 17-year gap between scientific evidence and patient care

Newer tests and treatments are not always better and too much care can be bad for your health.

BY DANIEL NIVEN | JAN 17 2017

Not many patients would be happy to hear that there’s a lag of about 17 years between when health scientists learn something significant from rigorous research and when health practitioners change their patient care as a result, but that’s what a now-famous study from the Institute of Medicine (now known as the National Academy of Medicine) uncovered in 2001.

The study reflects a major problem that has plagued health care systems for decades – namely, the timely integration of high-quality scientific evidence into daily patient care practices.

If you knew there was research available to guide the care you required, wouldn’t you want your health care provider and the system to use that research to inform decisions pertaining to your care? Wouldn’t you want to receive care that is scientifically proven to be of benefit and not receive care that is scientifically proven to be of no benefit?

Although it has been clear for centuries that science contributes to advancing the practice of medicine and improving disease-specific survival rates (for example, the discovery of penicillin and its effect on infection-related mortality rates), this concept became popularized within the medical community only in the last quarter of the 20th century, through the “evidence-based medicine” movement.

Closing the gap between research and practice

More recently, those who work in the field of “Knowledge Translation” have been working hard to close the gap between research and practice. For the most part, they’ve done this successfully by making abundant research findings more accessible to policy makers, professional societies and practitioners, as well as by “nudging” these parties to adopt evidence-based practices in a more timely manner.

Their methods have largely focused on the adoption of new beneficial practices – new drugs, tests or interventions with substantial evidence behind them. But a pattern has lately emerged from the scientific literature: new is not always better and too much health care can sometimes be bad for you.

There is now a movement afoot to promote the discontinuation of practices currently used in daily patient care that research finds to be of no benefit or potentially harmful – collectively referred to as unnecessary practices – owing to the recognition that they may negatively affect patient outcomes and contribute to burgeoning costs within health care systems. Initiatives such as Choosing Wisely, Less is More and Reducing Research Waste have sprung from medical professional societies and high-ranking medical journals to help reduce the practice of “too much” health care.

For example, it turns out that cervical cancer screening in women under age 30 is not beneficial and may lead to unnecessary follow-up testing; the use of bone cement to treat painful spine fractures among patients with osteoporosis does not improve pain any more than usual care; and placement of stents in the coronary arteries of patients with narrowed arteries but minimal symptoms is no better than treatment with medications alone.

Other examples include the use of a sophisticated monitoring device (a pulmonary artery catheter) to obtain frequent measures of heart function in patients with heart failure, and tight control of blood sugar using intravenous insulin in patients admitted to intensive care units.

In each of these examples, research demonstrates that the practice does not improve patient outcomes, yet each persists to some degree in current clinical care.

The 17-year gap between research and practice traditionally refers to the time required to adopt new practices. Unfortunately, new research shows that it may take even longer to de-adopt unnecessary practices. Regardless of the direction of practice change, shortening the gap between research and practice is long overdue and can only help improve outcomes for patients and control health care spending.

How do we get there?

Shortening the time gap between research and practice will require a better understanding of what it takes to implement new research, along with a reduction in the time it takes for new research to be reflected in professional guidelines.

Guidelines also need to be less cumbersome and designed more for use at the point of care rather than simply as reference documents. Health care systems, in turn, need to be engineered so that front-line providers have a greater likelihood of delivering care that is congruent with current science. This is likely best facilitated through the use of comprehensive electronic medical records. Given that many health care systems still rely on traditional paper-based charting and ordering, this will require considerable financial commitment.

Moving from research to improved practice more rapidly will also take an engaged group of stakeholders – professional societies, health care providers, patients and their family members, medical administrators and governments – who appreciate the long-term benefit that may be derived from such considerable initial investment of time and money.

A health care system that enables providers to consistently deliver care that aligns with recommended best practice should arguably be one of our national priorities.

Daniel Niven is an expert advisor with EvidenceNetwork.ca, an intensive care physician and assistant professor in the departments of critical care medicine and community health sciences in the Cumming School of Medicine at the University of Calgary.

COMMENTS

  1. Dr. Anil Rickhi / February 25, 2017 at 11:22

    Hi Dan,

    Thank you for writing this very articulate and informative article. I believe rigorous scientific studies are an essential part of improving patient care. It has been proven over time that advances in technology and science have had a positive effect on health care delivery and have decreased morbidity and mortality rates significantly. Vaccination is just one example.

    What is troublesome to me is that there seems to be a decline in regard for evidence-based literature. I have seen this amongst patients who opt out of vaccinations for their children because a celebrity without any medical training says that vaccinations are dangerous. This celebrity movement cites no sourced information because there is none. Vaccinations save lives.

    I am curious as to your opinion on why the public generally tends to accept non-science-based medicine promoted by celebrities without any medical training over the advice of their physicians.

    Is it a failure on our part as their health care providers? I can tell you from personal experience that when I was working with Doctors Without Borders in a third world nation, the uptake of vaccination and the societal acceptance that vaccinations are scientifically proven were higher than in the West.

    In this day and age, I find the trend towards dismissal of scientific evidence in the West disconcerting. However, I am wondering why this is occurring and what we can do as physicians to help educate the public.

    That being said, all patients – with the exception of certain circumstances stipulated under physicians’ ethical codes of conduct – have the fundamental right to make health care decisions for themselves. As physicians, I believe we need to provide our patients with more education and be conscientious and respectful of their views. Often, opposing views can diminish the physician-patient relationship if there is not mutual respect from both sides.

    I am curious to see any comments that others may have regarding Dr. Niven’s article.

    Keep up the good work Dan!

    Anil

  2. Samantha Byam / July 22, 2018 at 06:26

    I am currently doing a research module in health care. Do you think that the time it takes for evidence to be implemented or de-adopted is partly down to leadership? I wonder if it is down to a lack of strong leaders willing to push evidence forward, or a lack of confidence that management will listen.
    Could it also be that managers and leaders oppose changes in practice because they are not confident in their own ability to understand and critique the research? It takes courage to put one’s head above the parapet.
    Seventeen years is too long, whether implementing or de-adopting. I want to be a leader for change for the better.
