Avoiding Methodological Faults in Real-World Evidence

Researchers from Brigham and Women’s Hospital, Harvard Medical School and NPC found that the potential for biased findings in RWE studies increases substantially in the absence of a rigorous and appropriate methodological approach to study design and analysis. NPC, along with other research organizations, has long worked to identify best practices and standards for collecting and analyzing RWE.

Real-world evidence (RWE) – information collected from everyday health care experiences – helps researchers draw meaningful conclusions about which interventions are the most effective treatment options for specific conditions and patients. RWE can show, for example, how treatments work among similar patients based on their age, demographics and genes; what a disease course might look like in the coming years; or what treatment options might work best in diverse patient populations. Quality RWE could inform treatments and coverage decisions. Faulty evidence, however, could mean that stakeholders make important treatment decisions based on incorrect data.

That’s why it’s important to understand how RWE studies are designed and conducted, so that this growing area of research can continue to improve. Specifically, are the methods used in RWE studies rigorous enough, and do they avoid bias and other errors? Research recently published in Clinical Pharmacology & Therapeutics concluded that the potential for biased findings increases substantially in the absence of a rigorous and appropriate methodological approach to study design and analysis.

This study, conducted by researchers from Brigham and Women’s Hospital, Harvard Medical School and the National Pharmaceutical Council (NPC), aims to help people recognize methodological flaws ahead of time so they can be avoided while a study is being conducted – not just after it is completed. Researchers focused their analysis on several common pitfalls of RWE, such as flawed study timeframes and confounding – which arises when analyses do not adjust for factors such as age, sex, comorbidities and prior medication use, or when a treatment is not compared with a similar alternative.

Overall, researchers found that nearly 95% of RWE studies examined contained at least one avoidable methodological bias-related error stemming from poor study design and analysis. The most common methodological issue was the potential for time-related bias, which was found in 57% of studies.

NPC, along with other research organizations, has long worked to identify best practices and standards for collecting and analyzing RWE. For example, this paper describes a plan for improving the transparency of the research process and making registration of RWE study methods easier and more routine.

Many pharmacy and medical directors at health plans do not know how to evaluate RWE. To tackle this problem, research organizations have designed tools to help with the evaluation and use of RWE, along with best practices guidelines explaining what these tools do and how to use them. The guidelines compare the characteristics and features of five tools based on how each can be used to evaluate the quality and significance of RWE.

Through this study and these tools, we can continue to improve the quality of RWE and its utility for improving health care. Recognizing how to develop quality RWE studies – and how to evaluate existing ones – can strengthen the research, ultimately increasing confidence in the evidence it generates. Find out more about this study on the NPC website.