Executive Summary
In comments on FDA guidance on electronic health records and medical claims, the Duke-Margolis Center suggests a certification process for validated datasets, while data companies request the agency do more to publicize the experience to date with RWE in regulatory submissions.
In the most recent set of comments responding to the agency’s guidance on electronic health records and medical claims, stakeholders urge a certification process for validated real-world data sources and public disclosure of FDA assessments of real-world evidence in drug applications.
The comments suggest a common desire to leverage prior validation work and regulatory experience, rather than having to reinvent the proverbial wheel each time a dataset is used in the development of RWE to support a regulatory decision.
The comments respond to the agency’s September draft guidance on assessing electronic health records and medical claims to support regulatory decision-making. The document was the first of four RWD draft guidances issued by the agency in the latter half of 2021, with subsequent documents addressing data standards, registries and regulatory considerations. (Also see “Real-World Evidence: ‘Labor-Intensive’ Data Standardization May Be Required By US FDA” – Pink Sheet, 21 Oct, 2021.)
The FDA has announced plans for more RWE guidances in the coming year. The Center for Drug Evaluation and Research’s 2022 guidance agenda lists documents on the design and conduct of externally controlled trials, and use of clinical practice data in randomized controlled trials for regulatory decision-making.
In addition, the FDA has committed to establish a pilot program for advancing RWE in the seventh iteration of the Prescription Drug User Fee Act. (Also see “US FDA’s Stein ‘Excited’ About Real-World Evidence, Rare Disease Endpoint Pilot Programs” – Pink Sheet, 14 Sep, 2021.) The program is intended to:
- Identify approaches for generating RWE that meet agency requirements in support of labeling for effectiveness or for meeting post-approval requirements;
- Develop agency processes that promote consistent decision-making and share learnings on RWE; and
- Promote awareness of characteristics of RWE that can support regulatory decisions by allowing the FDA to publicly discuss study designs considered in the pilot program.
In the draft guidance on electronic health records and medical claims, the agency provided recommendations on selecting data sources to maximize the completeness and accuracy of data derived from EHRs and medical claims for clinical studies. It also addressed issues that are essential to determining the reliability and relevance of the RWD and that should be addressed in an RWE study protocol. (Also see “Real-World Data: US FDA Sets High Bar For Electronic Health Records, Claims In Applications” – Pink Sheet, 29 Sep, 2021.)
In the initial round of comments submitted on the guidance, stakeholders urged the agency to more clearly distinguish between medical claims and electronic health records. They cautioned that it could be difficult to meet the agency’s expectations related to background information about a particular data source, and they requested additional flexibility on validation expectations. (Also see “Real-World Data: FDA Urged To Distinguish Between Electronic Health Records And Medical Claims” – Pink Sheet, 7 Dec, 2021.)
The FDA extended the comment period for the guidance, and several additional comments were submitted by the new deadline of 24 January.
Certification Process For Data Sources
Several stakeholders raised concerns about potentially burdensome validation processes and expressed a desire to learn from the experiences of RWE sponsors who have satisfied the FDA’s strict regulatory requirements.
For data sources that might be used repeatedly, “a special certification process that allows sponsors or data source owners to complete validation and verification processes for key data elements, followed by periodic reviews to ensure data standards are maintained, could be one approach to facilitate an efficient review process,” the Duke-Margolis Center for Health Policy said in comments.
This type of certification process would drive transparency around data quality and offer opportunities to reference prior validated work, while also creating or maintaining data processing efficiencies for evidence generation within critical timeframes, the comments state.
Duke-Margolis previously has floated the idea of a voluntary precertification process that could help ensure the quality and integrity of databases from which RWE is derived. However, the proposal has also triggered concerns about the cost burden for data curators, as well as questions about who would perform the accreditation and under what criteria. (Also see “Real-World Data: Precertification Could Aid Use For Regulatory Decisions” – Pink Sheet, 17 Oct, 2018.)
The health policy center also raised concerns about the difficulty of identifying ways to validate a variety of variables across RWD sources for different disease use cases. Consequently, it proposed that the FDA work with sponsors and data curators to develop a publicly available list of prior successful validation methods and approaches that can be referenced for a variety of unique data sources.
This list could be based on lessons learned from work under the Sentinel Initiative and the FDA’s RCT Duplicate project, as well as the RWE provisions in the PDUFA VII commitment letter, Duke-Margolis said. “This list would ideally represent different types of data sources as well as a variety of disease focus areas.”
Both the certification process and the list of validation approaches potentially could help “to avoid the need to start from scratch each time validation or verification is required,” the Duke-Margolis comments state. “Absent a certification process, it should still be possible for sponsors to leverage prior validation and verification work on a given data source (supplementing with new work as necessary). The ability to build on prior high-quality work where appropriate will help make processes more efficient.”
Share Negative Assessments
In the guidance’s discussion of general considerations related to validation, the agency recommends assessing the performance of operational definitions in an adequately large sample of the study population as part of the proposed RWE study. “If sponsors propose to use an operational definition that has been assessed in a prior study, ideally those operational definitions assessed in the same data source and in a similar study population should be selected,” the guidance states.
The RWE protocol should include a detailed description of the planned validation, including justification for the choice of a reference standard, validation approach, methods, processes, and sampling strategy, if applicable.
“If a previously assessed operational definition is proposed, additional information should be provided, including in what data source and study population and during what time frame the assessment was conducted, the value of the assessed performance measures, and a discussion of whether the performance measures are applicable to the proposed study,” the guidance states.