Exploring the FDA’s latest proposed guidance on computer system validation, and the movement toward a less burdensome approach
In 1979, when the FDA began enforcing 21 CFR Parts 210/211, a modernized version of the good manufacturing practices (GMPs) for drug manufacturing, the pharmaceutical industry responded by creating formal, documented internal practices and procedures to ensure that its manufacturing equipment, systems and processes met the requirements defined in these regulations. The industry’s execution of these practices and procedures to ensure compliance with the GMPs became known as validation, or process validation.
When microprocessors and computers became more involved in manufacturing processes in the 1980s, the FDA issued specific guidance for validating computerized systems. Because much of a software program’s operation cannot be observed directly, because some programs are highly complex, and because the industry was unfamiliar with how computers are structured and operate, early computerized system validation (CSV) guidance promoted the generation of massive amounts of formal testing and documentation to achieve compliance. The result was arduous and sometimes difficult validation efforts that were, and still are, burdensome to many projects involving computerized GMP equipment and systems.
The amount of additional work required to meet the agency’s computerized system GMP guidance motivated most life science companies to create specialized CSV departments to generate, execute and approve the tests and other documents needed to verify that these computer programs and systems were installed and operating as specified. The CSV effort had become a significant portion of the cost and time required to implement new or modified equipment and systems in the life science industry, in many cases becoming a barrier to process improvement.
In response to the expensive and time-consuming CSV processes, the FDA has been pushing industry to adopt a “least burdensome approach to validation” for the past several decades, with less-than-desired results. The latest FDA guidance that proposed significant changes in the industry’s approach to validation, the Pharmaceutical CGMPs for the 21st Century—A Risk-Based Approach,1 was released in 2004. This guidance promoted the use of the latest technology, adoption of the newest quality methodologies and development of a risk-based assessment of system functions to achieve GMP compliance more efficiently than the methods used in the 1990s and earlier.
The industry has been slow to adopt the suggested changes in this guidance due to the initial cost to employ new technology and processes, the perceived lack of maturity of some of the technology, and the resistance of companies to change their processes from the FDA “known and accepted” validation methodologies that they have successfully employed for years.
The FDA’s latest proposed guidance on computerized system verification and validation, tentatively named Computer Software Assurance for Manufacturing, Operations, and Quality System Software (CSA), is designed to help the industry overcome the obstacles that prevent application of the least burdensome approach to validating computer-based equipment or systems. The proposed guidance will present the agency’s latest views on testing and documentation requirements for non-product software used in the drug manufacturing process, offering a new approach that emphasizes critical thinking early in the process to develop a verification plan that assures the software meets its intended use.
This move from a CSV model to a computer software assurance model is designed to move the industry from the current validation process of completing a prescribed set of documents for every CSV effort, to an intelligent approach that utilizes input from all stages of development and implementation to verify that the system meets its intended use.
Computer software assurance offers a streamlined and efficient approach
Although the proposed CSA guidance does not modify or supersede previous GMP guidance, and it does not change any existing predicate rules or 21CFR Part 11, it presents a different approach to meeting regulatory compliance than previous guidance and identifies two new strategies that manufacturers may want to use to assure the quality of software in GxP applications:
First, the new guidance emphasizes the verification and validation process to prove a system’s fitness for use rather than depending on the current method of generating a mandatory set of deliverables to achieve this goal. The proposed guidance supports a CSA program that uses rapid learning and continuous improvement to create plans to achieve compliance rather than the existing CSV programs that are locked into achieving compliance through the generation of a prescribed set of documentation, with little motivation for improving the validation process.
Second, the new guidance includes expanded instructions on the use of automated testing as well as recommendations for the acceptable use of vendor documentation and testing in the formal documentation of system installation and operation. This approach reduces the amount of redundant documentation and testing generated during validation, which increases the efficiency of collecting and documenting the specifications, installation and testing required to verify that a system is fit for use.
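The guidance does not prescribe any particular tooling, but as an illustration of how automated testing can generate reusable assurance evidence, a GMP-relevant check can be written as an ordinary automated test whose recorded results are retained. The function, acceptance limits and names below are hypothetical, not drawn from the guidance:

```python
# Illustrative sketch only: an automated test whose recorded pass/fail
# results could serve as assurance evidence. Function and limits are
# hypothetical examples, not requirements from the CSA guidance.

def within_temperature_spec(reading_c, low=2.0, high=8.0):
    """Check a cold-storage temperature reading against its specification."""
    return low <= reading_c <= high

def test_within_temperature_spec():
    # Readings inside the specified range pass.
    assert within_temperature_spec(5.0)
    # Boundary values are accepted (inclusive limits).
    assert within_temperature_spec(2.0) and within_temperature_spec(8.0)
    # Out-of-range readings fail.
    assert not within_temperature_spec(9.1)

test_within_temperature_spec()
```

Run under an automated test framework, checks like this can be executed on every system change, with the results captured electronically rather than re-documented by hand for each validation cycle.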
In addition to making these suggestions to improve the efficiency of assuring that the equipment or system software is fit for its intended use, the FDA Center for Devices and Radiological Health’s (CDRH) presentation on the draft CSA guidance proposed a four-step process to identify and apply the least burdensome approach for achieving computer software assurance. These four steps are:
- Identify intended system use in the specifications by listing all requirements that affect product safety, quality, efficacy or identification.
- Use risk-based assessments to identify each feature, function or operation’s risk to product safety or quality and identify appropriate activities for each risk level to ensure that they reliably perform as intended. These activities could occur in any stage of the system development and implementation, based on the feature, function or operation’s level of risk.
- Leverage and use existing activities and supplier data, as appropriate, to assure proper system operation. This includes automated testing results, electronically collected data and records, vendor documentation and testing generated during system development, and agile and unscripted testing created during unit and system testing and commissioning. Employ process controls to mitigate risk when possible, and use any information gathered on the system, from the beginning of the selection process until the system is approved for use, to assure system operation.
- Define and document appropriate records to decrease the focus on creating documentation “to meet compliance standards.” Collect data that is useful for the manufacturer to verify that the system operates reliably and as specified; do not create or collect data for the sole purpose of satisfying an auditor.
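The risk-based core of these four steps can be sketched as a simple mapping from each requirement’s risk level to an appropriate assurance activity. The risk levels, requirement names and activities below are illustrative assumptions for the sketch, not terms defined in the draft guidance:

```python
# Minimal sketch of a risk-based assurance plan, assuming a simple
# three-level risk model. All categories and activities are illustrative.

ASSURANCE_ACTIVITIES = {
    "high": "scripted testing with documented evidence",
    "medium": "unscripted exploratory testing with a summary record",
    "low": "vendor documentation and ad hoc verification",
}

def plan_assurance(requirements):
    """Map each requirement to an assurance activity based on its risk level."""
    return {
        req["name"]: ASSURANCE_ACTIVITIES[req["risk"]]
        for req in requirements
    }

# Hypothetical requirements identified in step one:
requirements = [
    {"name": "batch record e-signature", "risk": "high"},
    {"name": "report formatting", "risk": "low"},
]
plan = plan_assurance(requirements)
```

The point of the sketch is the shape of the decision, not the code: effort concentrates on the functions that affect product safety or quality, while low-risk features lean on existing vendor evidence.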
The benefits of a CSA guidance
The proposed guidance reiterates the FDA’s desire to have life science companies implement quality and testing efficiencies that other industries have already employed to drive down product costs and increase product quality. The agency expects the CSA guidance to move the needle for the pharmaceutical industry more than the previous guidance because it presents a detailed road map that uses critical thinking early in the process, implements risk-based assessments during planning and leverages vendor documentation and testing during validation. The goal is to change the paradigm of computerized system validation from a massive, document-driven practice with redundant testing into one that requires reduced amounts of testing and documentation to achieve the same results—that the software is assured to perform as expected during commercial operation of the system.
So, is CSA a game changer?
I know of several companies that have used “CSA-type” strategies for years, confident in their ability to show adherence to the Code of Federal Regulations (CFRs) as they reduced the amount of testing and documentation using aggressive risk-based approaches. There are also a few companies that completed pilot CSA programs at their facilities, using the guidance to significantly reduce the effort and cost of verification activities. But most organizations continue to follow their established CSV programs because of the barriers mentioned earlier.
This guidance, providing implicit FDA approval for the use of these methods, as well as some direction on how to implement them, should eliminate most of the apprehension that the industry has in moving to a CSA methodology to verify that computerized systems perform as intended. The guidance’s promotion of critical thinking, rigorous risk-based assessments and use of automated testing and vendor deliverables should motivate all companies to embrace an efficient “FDA-approved” process that assures software quality for any non-product software.
This result could finally permit the FDA and industry to announce that they have achieved the long-desired goal of a “least burdensome approach to validation,” which will most certainly lower the cost and time required to develop and implement (or modify) GMP equipment and systems that use non-product software without sacrificing quality. These results would make the CSA guidance a true game changer for the industry.
About the Author
Brian Stephens is Assistant Director at CAI.
1. Pharmaceutical CGMPs for the 21st Century—A Risk-Based Approach (Final Report), https://www.fda.gov/media/77391/download