The pervasiveness of real-world data (RWD) and the need to understand outcomes in clinical practice have driven the growing use and utility of real-world evidence (RWE). Within the life sciences industry especially, RWE is increasingly leveraged by research and development, epidemiology, outcomes research, and commercial groups across an asset’s lifecycle to strengthen product evidence and value messages. RWE studies are designed to align with corresponding stages of product development, and include evaluations of the natural history of disease, treatment pathways, characteristics of the target population, drug utilization and effectiveness, and drug safety.
Despite the prevalence of RWD, and by extension RWE, it is important to recognize some of the research challenges associated with RWE. Identifying an appropriate database that contains the right data elements is a critical first step. Equally important is defining the study methodology, which includes identifying the correct population, measures, time periods, and statistical techniques for implementation. Any misstep in this process may compromise the validity of RWE outputs and their interpretation.
The Institute for Clinical and Economic Review (ICER) recently released its 2020-2023 Value Assessment Framework affirming the organization’s commitment to the development and integration of RWE into its economic modeling exercises. In particular, ICER has committed to the use of RWE in 2 applications:
While we applaud this pledge, we identified five areas in which ICER should provide greater transparency regarding its intentions, policies, and procedures when generating and applying RWE in its assessments.
In recently released documents, ICER provided a glimpse of its protocol development process, indicating that protocols will be developed a priori, before RWE evaluations begin. While protocols are a necessary first step in the RWE generation process, we have the following questions:
Which organization(s) will be responsible for designing the RWE study methodology and drafting the protocol? Will ICER assume this responsibility? An outside group? What are the credentials and experiences of the responsible parties?
Additionally, ICER recently provided limited information about its protocol development process: “The protocols developed will be included in the assessments that ICER releases to the public.” ICER publicly disseminates several documents throughout its evaluation timeline, including methods documents prior to an evaluation (i.e., draft and revised scoping documents, clinical research protocol, model analysis plan) as well as results documents following an evaluation (i.e., draft evidence report, revised evidence report, final evidence report and meeting summary). The ambiguity of ICER’s statement is concerning because the organization does not specify in which of these documents the protocols will appear. This has important implications for ICER’s timeline, and specifically for its requests for public comments and critiques. For instance, if ICER includes the protocol only in its revised or final evidence reports, doing so would preclude a public comment period as well as sufficient time for replication.
Will the public have an opportunity to review and critique ICER’s RWE protocols in the same manner that ICER allows for review and critique of its cost-effectiveness modeling methods? How much time will the public have to implement ICER’s protocol and attempt to replicate its RWE results?
As mentioned above, selecting the right database to answer the research question is of paramount importance. In its Value Assessment Framework, ICER indicated that de novo evidence generation may be undertaken using insurance claims data or patient survey data, which raises a series of questions:
Will ICER consider alternative sources of data, such as data from EMRs or registries? Claims and patient survey data carry distinct limitations that may restrict their appropriate use in ICER’s assessments. For instance, clinical data may be required to properly estimate health state transitions and intermediate clinical outcomes (e.g., HbA1c readings), both of which are unlikely to be available in claims or survey data. Also, how realistic is de novo data collection from surveys? These efforts are time-consuming and costly, and therefore may not be feasible to administer or apply.
In its Value Assessment Framework, ICER stated that the objective of the 24-month update is to supplement its economic models with “the early impacts of these therapies” using RWE. Questions abound regarding:
How will RWE contribute to ICER’s 24-month reassessment? Will ICER assess an intervention’s comparative effectiveness? What measures will ICER evaluate to estimate effectiveness? Many clinical variables used to assess real-world clinical effectiveness are unlikely to be included in insurance claims data (unless claims are linked to laboratory values, in which case sample sizes may become an issue). Furthermore, health utilities will not be captured in claims data. Will ICER, therefore, have to rely on proxy measures such as treatment adherence, ER visits, and hospitalizations to estimate treatment benefit?
In its Value Assessment Framework, ICER indicated that de novo evidence generation will be undertaken if “critical data elements” for its economic modeling efforts are lacking. The use of the word “critical” in this context implies that analyses will be conducted for key model drivers, which may substantially sway cost-effectiveness ratios and thus the valuation of an intervention (i.e., low vs. intermediate vs. high value-for-money). If so, it is imperative that researchers validate ICER’s RWE findings.
Who will be responsible for validating ICER’s RWE findings that may have substantial implications for ICER’s cost-effectiveness model results? What methods and tools will ICER use? Will the Institute use alternative data sources (even an alternative claims database) to confirm RWE findings?
ICER, in the latest iteration of its Value Assessment Framework, makes a concerted effort to enhance its cost-effectiveness modeling initiatives through the generation and integration of RWE. While we applaud its efforts to innovate, we believe ICER needs to be careful in its approach, or it risks undermining not only the transparency of its process but also the results, interpretations, and recommendations that emerge from its evaluations.
BHE's Modeling & Strategy Solutions have helped a number of life sciences organizations prepare for and mitigate ICER reviews. Learn more about how parallel modeling provides evidence to influence ICER reviews by checking out our recent white paper.