In a recent blog post, I wrote about how manufacturers can overcome data integrity problems that potentially imperil an application’s approval and the commercialization of a product. Sometimes business circumstances intervene, and remediating the problems after an acquisition is the only viable option. Far better to avoid those circumstances altogether by clearing an application of data integrity issues prior to submission.
Recent news headlines have featured several prominent statements from FDA – including the Acting Commissioner – reiterating that manufacturers must submit data to the agency with the necessary level of quality and fitness for use in regulatory submissions. This includes data considered unreliable, whether through the manufacturer’s negligence or intentional misconduct.
From the Acting Commissioner’s perspective, if a company submits an application that contains false or inaccurate data, several bad outcomes result: it undermines the search for a treatment or cure, violates the public trust, raises costs, exposes people to needless therapies, gives science a bad name, and most importantly, it’s bad for patients.
The violation of the public trust and mission of the FDA is particularly acute since the agency does not have the resources to referee all the data contained in regulatory applications. The FDA is in a position where it must trust applicants at some level.
Many publications that discuss the performance of due diligence for healthcare product transactions include information regarding the scope of regulatory due diligence. This typically involves an examination of various regulatory filings, regulatory correspondence, manufacturing compliance audits, and any remediation activities.
Once the regulatory filings and correspondence have been identified, it is commonplace to scrutinize their content to identify issues, assess compliance with applicable health authority regulations and policy (typically based on the product quality framework outlined in Modules 2 and 3 of the Common Technical Document (CTD)), and recommend alternatives to the positions taken in the application.
What may not be as readily apparent is the need to closely examine the underlying pedigree of the data contained in the various regulatory filings – not just the presentation of those data. That pedigree needs to be assured at every level of generation and reporting. Applicants may perform a final quality control check of all the data in the application to confirm that they align accurately with the data contained in various development reports and executed protocols.
However, the pedigree of the data must be assured down to the level of all applicable raw data generated and collected during the development program, since those data are incorporated into the various reports. If the pedigree cannot be assured from the application back to the stage at which the data were generated, companies can later face adverse consequences when there are breaks in that pedigree and the data contained in an application vary from the raw data that preceded the reports.
A manufacturer runs a serious risk when, whether through carelessness or intent, raw data in reports differ from those originally generated and collected during development. This is particularly important for companies acquiring assets in the late stages of development, either shortly before a marketing application is planned for submission or when the change in control is executed after an application has been submitted and is under review. As the recent headlines exemplify, if the FDA determines that an application at either of these stages is adversely affected by data integrity issues, the consequences can be severe and extremely disruptive.
Admittedly, undertaking this type of activity can be onerous when deal timelines are compressed and an acquiring company must examine vast amounts of data and information generated over years by several contract development and manufacturing organizations. It’s not uncommon for late-stage programs to have spreadsheets running tens of thousands of lines just to catalog the entire inventory of development activities and reports.
At a minimum, an acquiring organization should have an approved, ready-to-execute protocol in place for due diligence of an application’s data pedigree, along with clear policies for communicating with a health authority if problems arise during the due diligence exercise.
The FDA will also scrutinize the data at both of these levels. The agency recently published the revised Compliance Program Guidance Manual (CPGM) 7346.832 for Pre-Approval Inspections (PAI), which becomes effective September 16, 2019. You can find a copy of the CPGM here.
Two of the three inspectional objectives involve evaluating data for fitness: conformance to application and a data integrity audit. The conformance-to-application objective tests whether the application aligns with source documents on site, instructing investigators to verify that the formulation, manufacturing methods, analytical methods, and batch records are consistent with the descriptions in the CMC section of the application. FDA investigators will also audit and verify raw data at the facility to help authenticate that the data submitted in the CMC section are relevant, accurate, complete, and reliable for FDA assessment.
Don’t put yourself in a position where FDA is driving the examination of the pedigree of the data in your regulatory applications and on site at manufacturing facilities. Take the FDA at its word: less is more when it comes to manufacturers making headlines for these issues.