Quality by Design for Laboratory Automation
A closer look at process understanding, process analytical technology and risk analysis implications for the modern lab (part 2 of 2)
As we learned in part 1 of this series,1 Quality by Design (QbD) is a concept rapidly gaining traction in the biomedical industries. Led by a U.S. Food and Drug Administration (FDA) initiative to reduce costs, increase quality control and shift the primary responsibility for quality assurance (QA) firmly onto the industry, QbD principles are increasingly being adopted by automated laboratories. Whereas part 1 defined the key variables and their relationships, this installment further details the definitions and parameters of process understanding, process analytical technology (PAT) and risk analysis, and discusses the implications for modern laboratories.
Process understanding
In a laboratory setting, the process understanding step is generally included in the design of the laboratory, the selection of the laboratory equipment and/or the design of the laboratory experiment. It is in those planning stages that critical decisions are made about the collection and processing of samples, the assays to be conducted and the interpretative analyses to be applied to those assays. These design space decisions are summarized in the study protocol and should include the decision criteria for acceptance, rejection or re-analysis of a test result. Appropriately, those decision criteria include ranges of acceptable and unacceptable result measurements.
While there is a wide range of laboratory testing models, most will include four steps:
• Sample manipulation and control — identification, plating or another maneuver to mix the sample with reagent and enter the sample into the device, or to examine the sample under varying magnification or manipulative conditions
• Testing and data measurement — the process of determining the reaction of the sample to the manipulation and the numerical characterization of that reaction
• Data collection — the reporting of those measurements to a database carefully tied to the sample identification
• Data analysis — statistical interpretation of the collected data to provide graphic, numerical and/or label results
When carefully and operationally defined, these four steps provide a composite picture of the design space and a clear delineation of the process or processes.
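To make the four steps concrete, the short Python sketch below walks a single hypothetical sample through manipulation, measurement, collection and analysis, applying the kind of acceptance range a study protocol might define. The sample identifier, reagent name, readings and limits are illustrative assumptions, not values from any actual protocol.

```python
"""Illustrative sketch of the four-step laboratory testing model.
All names, limits and values are hypothetical examples."""
from statistics import mean, stdev

# Step 1: sample manipulation and control -- identify the sample and pair it with a reagent
sample = {"id": "S-0001", "reagent": "reagent-A", "volume_ul": 50}

# Step 2: testing and data measurement -- replicate instrument readings (simulated here)
raw_readings = [0.48, 0.51, 0.49, 0.50]  # e.g., absorbance units

# Step 3: data collection -- store the measurements keyed to the sample identification
database = {}
database[sample["id"]] = {"reagent": sample["reagent"], "readings": raw_readings}

# Step 4: data analysis -- summarize and apply the protocol's decision criteria
ACCEPT_RANGE = (0.45, 0.55)   # hypothetical acceptable result range from the study protocol
result = mean(database[sample["id"]]["readings"])
spread = stdev(database[sample["id"]]["readings"])

if ACCEPT_RANGE[0] <= result <= ACCEPT_RANGE[1]:
    decision = "accept"
elif spread > 0.05:           # excessive replicate variability triggers re-analysis
    decision = "re-analyze"
else:
    decision = "reject"

print(f"{sample['id']}: mean={result:.3f}, sd={spread:.3f}, decision={decision}")
```

In an actual laboratory, each stand-in above would map to an instrument driver, a LIMS record and the statistical analysis specified in the protocol.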
Process analytical technology
PAT has recently been endorsed by the FDA as a major component of manufacturing and analysis quality control.2 Widely used in other industries (particularly chemical and petroleum), and known under a variety of names in the pharmaceutical industry, PAT is a process of monitoring the performance of the variables and functions identified in the process understanding procedures.
PAT has three basic components: two that are critical to the definition of process analytical control, and a third that represents a secondary implication with powerful value. A PAT system is characterized by continuous measurements,3 cybernetic responses to those measurements and, potentially, the capability of remote monitoring.
It is, of course, possible to design a process without any quality checkpoints until the final stage and, presumably, to quarantine the end product of that process pending successful passing of a final-stage test. The processing of human blood, for example, follows this model. A blood bag and two test tubes are collected and identified (one tube for testing, the second for archive). After the processing of the bag, which takes only a few minutes, that bag of whole blood, plasma, cells and so on is refrigerated and stored awaiting the longer testing process. Meanwhile, and for several hours thereafter, the test tube of blood is examined for possible HIV, hepatitis and other contamination. When the tests are completed, the quarantined bag is retrieved and either labeled as safe for use or destroyed. While a necessary procedure in this instance, such a process is dangerous and inefficient — quarantine errors, mislabeling and other problems can have serious consequences.
In most circumstances, there is a safer, higher-quality alternative. Under a PAT system, measurements of product purity and other characteristics can be collected at every stage of the process. The manufacture of penicillin, for example, requires fermenting a mixture of grain, animal blood, water and spores for several weeks. Variations in temperature, pressure or appropriate mixture agitation can kill the spores and prevent the growth of the penicillin. With continuous measurements, however, those variations can be quickly noted and corrected.
Better than having a human quality control professional note the variations on a gauge and make the appropriate adjustment is to have the automated system, using a thermostat or equivalent, self-correct the out-of-spec measurement. This cybernetic process represents the second leg of the PAT procedure. In the penicillin fermentation chamber, for example, sensors can increase or decrease temperature, adjust the pressure, and turn the agitator on or off without awaiting the action of a quality control person. The result is much more rapid, more certain and less subject to human error.
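The following Python sketch illustrates, in minimal form, the first two PAT components described above: rapid repeated measurements and a cybernetic response that corrects out-of-spec values without operator intervention. The fermentation set points, sensor reads and actuator calls are hypothetical stand-ins, not an actual control system.

```python
"""Minimal sketch of a PAT-style closed loop, assuming hypothetical
fermentation specification limits; sensor and actuator calls are stand-ins."""
import random
import time

# Hypothetical specification limits for the fermentation chamber
SPEC = {"temp_c": (24.0, 26.0), "pressure_kpa": (100.0, 105.0)}

def read_sensor(name):
    """Stand-in for an instrument read; returns a simulated value."""
    low, high = SPEC[name]
    return random.uniform(low - 1.0, high + 1.0)

def correct(name, value):
    """Stand-in for an actuator command (heater, valve, agitator)."""
    low, high = SPEC[name]
    target = (low + high) / 2
    print(f"  correcting {name}: {value:.2f} -> set point {target:.2f}")

# Rapid, repeated discrete measurements treated as a continuous stream (see note 3)
for cycle in range(5):
    for name, (low, high) in SPEC.items():
        value = read_sensor(name)
        in_spec = low <= value <= high
        print(f"cycle {cycle}: {name}={value:.2f} {'OK' if in_spec else 'OUT OF SPEC'}")
        if not in_spec:
            correct(name, value)   # cybernetic self-correction, no operator needed
    time.sleep(0.1)                # measurement interval (shortened for illustration)
```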
The final PAT characteristic is not fundamental to the concept, but may represent a secondary benefit. Because PAT, in effect, automates a part of the quality control process, its use permits remote monitoring of a laboratory or manufacturing system. Using the Web or another connection, it is possible to centralize monitoring in a single station, with seamless human oversight and easy access for invited outside experts. With a remote monitoring system, an organization could consult with an expert on a particular problem, permit temporary access by FDA regulators to discuss a strategy, and lower monitoring costs with a centralized facility. The power industry has implemented this model: GE Energy has constructed a centralized monitoring station (located in a suburb of Atlanta, GA) that monitors processes and safety at power plants around the world.4 While the remote capability does require some access control and security restrictions, and also may require special FDA access policies, it does represent a valuable step in the QbD concept. With remote monitoring, continuous measurement and cybernetic controls, PAT provides measurement of the key process points identified and is a critical piece of the QbD picture.
Risk analysis
The concept of risk analysis was first borrowed from the review process for medical devices in 2001 and applied to 21 CFR Part 11, the regulation of computer systems.5 While Part 11 theoretically applied to all computer systems in use at a pharmaceutical organization, emphasis was clearly placed on the high-risk systems in which a control error would directly impact human health and safety.
In recent years, the FDA has applied risk analysis to all areas of regulation.6 As the third critical element in quality by design, risk analysis allows prioritization of the quality measurements, guiding the process of determining which variables are most important and the degree of variance acceptable for each variable.
Once there is a clear understanding of the laboratory testing (or manufacturing, or development) process, and a clear scheme for the measurement (using a PAT approach) of that process, the system manager is left with a dilemma. With a thorough identification of variables, and near-constant measurement of those variables, there is a vast quantity of data containing minor variations in values. Is a two-percent variance in the pill stamping pressure significant? Does a small spray variation on a capsule label painter have a real impact? Does a 0.005 percent evaporation on a liquid formulation adversely affect product strength?
To answer these kinds of questions, and to focus on the important variations without drowning in trivial or insignificant deviations, a risk assessment can provide defensible parameters and decision criteria. In the broadest terms, those variations which impact health and safety are important, while those that do not are generally considered insignificant.7
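A risk assessment of this kind can be reduced, in simplified form, to a small decision table: each variable carries a tolerance and a severity ranking tied to its potential health and safety impact, and deviations are triaged accordingly. The Python sketch below shows the idea; the variables, tolerances and severities are hypothetical examples only.

```python
"""Sketch of risk-based triage of process deviations, assuming
hypothetical variables, tolerances and severity rankings."""

# Hypothetical risk register: allowed relative deviation and the severity
# of the health/safety impact if the variable drifts out of tolerance.
RISK_REGISTER = {
    "stamping_pressure":    {"tolerance": 0.05, "severity": "low"},
    "label_spray_volume":   {"tolerance": 0.10, "severity": "low"},
    "active_concentration": {"tolerance": 0.01, "severity": "high"},
}

def triage(variable, target, observed):
    """Classify a deviation using the risk register's decision criteria."""
    entry = RISK_REGISTER[variable]
    deviation = abs(observed - target) / target
    if deviation <= entry["tolerance"]:
        return "within tolerance"
    return "investigate" if entry["severity"] == "high" else "log and monitor"

# Example: a two-percent pressure variance vs. a two-percent potency variance
print(triage("stamping_pressure", 100.0, 102.0))    # within tolerance
print(triage("active_concentration", 5.00, 5.10))   # investigate
```

In practice, the register would be derived from a documented risk analysis and maintained as part of the quality system rather than hard-coded.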
QbD implications
The use of a QbD approach is not currently mandatory, and the concept is still evolving. The first trials of QbD submissions are currently underway, and most quality by design attention has focused on the manufacturing process. Application of QbD to laboratory settings is just beginning to emerge, and formal guidelines and regulations are probably five to eight years away. However, the consideration of a QbD construct in planning, operating and evaluating a laboratory operation has immediate value. In fact, it is likely that many laboratories will adopt QbD well in advance of any FDA mandate, yet another instance of the industry driving new regulation.8
Regardless of the state of FDA acceptance or requirement, quality by design provides a management framework with financial incentives significant enough to make it attractive. Quality problems detected post-application require product rejection (in a quarantine system) or recycling (in a fill line, for example). In a laboratory, post-analysis quality errors, at best, require a rerun of a test or experiment. At worst, the problem could corrupt results and lead to expensive erroneous conclusions.
With a quality by design approach, the laboratory process begins with a clear analysis of the test process — a step critical to system design, equipment selection and experimental design. It next implements a continuous stream of quality checks, rapidly signaling significant deviations from experimental design, equipment problems, operating failures and process errors. Those problems may be cybernetically self-corrected or may warn the laboratory operator to halt the process until a correction can be implemented.
Finally, a QbD approach provides a risk assessment that identifies the appropriate focus of the laboratory operations and provides a defensible justification for tolerances of variations. The effect may be extended life of otherwise arbitrarily discarded reagents, useful data from slightly contaminated samples and a more efficient laboratory operating procedure.
While a cost justification of a quality by design approach must await more diverse industry experience with QbD implementation in laboratories, preliminary results for QbD in manufacturing suggest a rapid payback and a strong cost/benefit ratio.9
Summary
Over the next 10 years, quality by design is likely to evolve into the dominant paradigm for biomedical quality. Current applications of QbD are rapidly infusing the manufacturing industry. As initial trials prove successful, QbD submissions are likely to dominate that sector over the next two to three years. Laboratories are adopting QbD procedures, with formal guidelines and regulations anticipated approximately five to eight years from now. Applications of QbD to clinical studies, already in exploratory stages, should reach critical levels within the decade.
The rationale behind the expected evolution to QbD dominance is based upon two critical factors:
• First, quality by design is comprehensive. It incorporates all three aspects of quality control and assurance: an approach to identification and analysis of the processes underlying the system, a methodology for measuring and cybernetically controlling those processes, and a risk-based prioritization and interpretation of those measurements.
• Second, quality by design is logically consistent. It provides a rational framework for controlling quality in production, research, laboratories and other settings, with a reasonable and defensible system of tolerances and permissible variations.
QbD makes good sense for the management of an automated laboratory — understand the processes involved, measure the performance of those processes and interpret those measurements in a risk construct.
With reasonable assurance, a QbD approach to laboratory quality control and assurance will be the dominant paradigm of the future.
References
1. S. Weinberg, “Quality by Design for Laboratory Automation: A look at key variables and their relationships (part 1 of 2),” Scientific Computing, November/December 2008.
2. S. Weinberg, “Process Analytical Technology for Chromatography,” Journal of Chromatographic Science, V. 44, March 2006.
3. Strictly speaking, PAT measurements are not continuous but are rapid, repeated discrete measurements; they are treated as though they are continuous in most analyses.
4. Private tour, June 2006.
5. S. Weinberg, “The FDA Vector: Part 11 Risk-Based Changes,” American Biotechnology Laboratory, May 2003.
6. S. Weinberg, “Regulation of Computer Systems” (chapter), in Computer Applications in Pharmaceutical Research and Development, Sean Ekins, ed., Wiley, 2006.
7. Exception: some minor variations, not directly affecting health and safety, serve as “coal mine canaries,” warning of potential future impact on important variables.
8. See, for example, S. Weinberg, “Cost-Effective Compliance,” Scientific Computing and Instrumentation, March 2003.
9. Snee, R.D., L.B. Hare and J.R. Trout, Experiments in Industry — Design, Analysis and Interpretation of Results, Quality Press, 1985.
Sandy Weinberg is an Associate Professor of Health Care Management at Clayton State University and a Senior Consultant at Tunnell Consulting. He may be reached at [email protected].
Acronyms
FDA U.S. Food and Drug Administration | PAT Process Analytical Technology | QA Quality Assurance | QbD Quality by Design