This is the fourth and final part of a series reviewing and critiquing the recent Medicines and Healthcare products Regulatory Agency (MHRA) guidance for industry document on data integrity.1 The first part of the series2 provided a background to the guidance document and discussed the introduction to the document. The second part reviewed the data governance system,3 and the third part discussed data criticality and the data lifecycle.4 This part reviews the system design, some of the definitions, and finishes with an overall assessment of the guidance.
Designing Systems to Assure Data Quality and Integrity
This portion of the MHRA guidance1 consists of two sections. The first is a list of bullet points for the design of systems, and the second is a discussion on scribes for documenting GMP activities.
Turning to the first section, my view is that many of the bullet points are poorly written, with some basic errors. Below are the bulleted points from the MHRA document, and underneath each one are my comments and critique:
- Access to clocks for recording timed events.
This is a poorly written item, as it implies that anyone, or any system, can access a clock, whether for a manual or a computerized process. I think the point is intended for computerized systems rather than manual processes, unless a manual test involves timed events, such as a loss on drying (LOD). What this point should say is that an application needs access to the system clock to provide the date and time stamp for events within it. By implication, workstations should be networked, so that the time stamp remains accurate via a time server linked to a trusted time source and no manual intervention is required. However, the main issue is that access to the system clock must be restricted to authorized individuals, to prevent time traveling and data falsification.

- Control over blank paper templates for data recording.
Perhaps a better phrasing of this requirement is found in the FDA's 1993 Guide to Inspection of Pharmaceutical Quality Control Laboratories: "We expect raw laboratory data to be maintained in bound (not loose or scrap sheets of paper) books or on analytical sheets for which there is accountability, such as pre-numbered sheets."5 Far more succinct and to the point.

- User access rights which prevent (or audit trail) data amendments.
Perhaps a better way to express this is that user types or roles need to be defined and documented, along with the corresponding access privileges for each role. In addition, any access privileges that enable a user to modify or delete records need to be justified and, where they are granted, data modifications and deletions must be captured by the audit trail in the application (a minimal sketch of this idea follows this list).

- Automated data capture or printers attached to equipment such as balances.
Put at its most basic: inspectors do not trust people to make manual observations of critical data from analytical balances. They want independent verification of the weights of reference standards and samples used in analytical procedures. Standalone balances without printers may have been acceptable 30 years ago, but they are no longer, owing to cases of data falsification as well as simple human error. An analytical balance with a printer is now the status quo; see the discussion in Part 3 of this series.4 However, what about other instruments, such as a pH meter used to check that mobile phases or buffers have been made up correctly: is a printer necessary? Enter stage left a risk assessment! (A sketch of automated data capture from a balance also follows this list.)

- Proximity of printers to relevant activities.
This applies mainly to hybrid systems: if data are acquired, processed and reported electronically with electronic signatures, the need for a printer close to the activity diminishes.

- Access to raw data for staff performing data checking activities.
This is similar to the FDA GMP requirement for complete data and to the expectation that the second person review covers all data generated in the course of an analysis.3,9
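To make the access rights and audit trail bullet more concrete, here is a minimal sketch in Python of the idea: roles with documented privileges, and an audit trail entry, complete with a time stamp taken from the system clock, written whenever a privilege that modifies data is exercised. The role names, fields and in-memory list are illustrative assumptions on my part, not anything prescribed by the MHRA guidance.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Privileges documented per role; the role names are illustrative assumptions.
ROLE_PRIVILEGES = {
    "analyst":       {"create", "read"},
    "reviewer":      {"read"},
    "administrator": {"create", "read", "modify", "delete"},
}

@dataclass
class AuditEntry:
    user: str
    action: str
    record_id: str
    reason: str
    # Time stamp taken from the system clock; in practice the workstation clock
    # should be synchronized to a trusted time source and must not be
    # adjustable by ordinary users.
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

audit_trail: list[AuditEntry] = []

def modify_record(user: str, role: str, record_id: str, reason: str) -> None:
    """Refuse the change unless the role is privileged, and log it if allowed."""
    if "modify" not in ROLE_PRIVILEGES.get(role, set()):
        raise PermissionError(f"Role '{role}' may not modify records")
    audit_trail.append(
        AuditEntry(user=user, action="modify", record_id=record_id, reason=reason)
    )
    # ... the actual change to the record would be applied here ...
```

The point of the sketch is simply that the privilege check, the reason for change and the time-stamped audit entry belong together in the application, rather than being left to procedural controls.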
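Similarly, for the automated data capture bullet, the following sketch shows the principle of capturing a weight directly from a serial-connected balance together with its contextual metadata, instead of relying on manual transcription. It assumes the pyserial package and a hypothetical balance that returns a plain-text weight in response to a print command; the port name, command string and identifiers are illustrative, not any vendor's actual protocol.

```python
from datetime import datetime, timezone
import serial  # pip install pyserial

def capture_weight(port: str = "COM3") -> dict:
    """Read one weight from a serial-connected balance and record it with metadata."""
    with serial.Serial(port, baudrate=9600, timeout=2) as balance:
        balance.write(b"P\r\n")   # hypothetical 'print current weight' command
        reading = balance.readline().decode("ascii", errors="replace").strip()
    return {
        "raw_value": reading,                              # value as sent by the instrument
        "instrument_id": "BAL-01",                         # illustrative identifier
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "captured_by": "data system",                      # no manual transcription step
    }
```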
As for the second section, my advice on scribes in a normal laboratory environment is: don't use them, as they would cause more compliance problems than they solve. Furthermore, there is no equivalent position from the FDA on the subject.
Definitions and Expectations Associated with Data
There are 19 definitions in the MHRA document;1 this critique will focus on only three of them: raw data, metadata and data. The problem with these three is that we are given a surfeit of definitions but little information about how they link together, and this, I would suggest, is a major omission from the guidance: figures are far better at putting context around some of the key definitions. For simplicity, I have not included the regulatory expectations, although the criteria for data integrity (ALCOA+) were discussed earlier in this series.1
Table 1 lists the three MHRA definitions for raw data, metadata and data from the guidance document.1 These definitions are presented in the document, but they are not really linked to what happens in practice in the laboratory. The principle of EU GMP Chapter 4 is more informative: "Records include the raw data which is used to generate other records."6 Therefore, by regulatory definition, we need to consider far more than just the raw data and the associated contextual metadata: we also need the processed or interpreted data derived from them, as well as the generation of the reportable result. Furthermore, as mentioned earlier, data cannot be considered to be information. As such, the MHRA definitions should be revised again to reflect these concerns.
Table 1: MHRA Definitions for Raw Data, Metadata and Data
| Word | MHRA Definition1 |
| --- | --- |
| Raw data | |
| Metadata | |
| Data | |
What do these definitions mean in practice? Let us look at three options shown in Figure 1:
- a paper-based test using observation with documentation by writing in a laboratory notebook
- a test conducted using a hybrid system
- and, finally, one using electronic workflows and electronic signatures
How do these different tests link to the three definitions? The three tests are also broken down in Table 2 into raw data with the associated metadata, processed data, information and knowledge. The latter two topics are either misunderstood in the MHRA guidance (information) or not mentioned at all (knowledge), and they are the subject of a separate paper.7
The first example is an observation of a test, for example of color or odor, that is written into a laboratory notebook or onto a controlled sheet. The second example is a hybrid system in which observations generate electronic records; some of the metadata are written down while the rest are contained within the application, and the generated data are manually typed into a spreadsheet to calculate the reportable result. The last example is an electronic system in which all activities are contained within the application and its underlying database, and the reportable result is electronically signed by both the tester and the reviewer.
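To show how the definitions hang together in the fully electronic example, here is a minimal sketch, assuming a chromatographic test, of how raw data, contextual metadata, processed data and the reportable result can be modelled as one linked set of records rather than as isolated definitions. The structures and field names are illustrative assumptions on my part, not taken from the MHRA guidance.

```python
from dataclasses import dataclass

@dataclass
class RawData:
    data_file: str        # e.g. the acquired chromatography data file
    instrument_id: str
    acquired_at: str      # time stamp applied by the data system

@dataclass
class Metadata:
    method_name: str
    method_version: int
    analyst: str
    audit_trail_ref: str  # pointer to the application's audit trail entries

@dataclass
class ProcessedData:
    peak_areas: dict      # e.g. integrated peak areas per analyte
    calculation: str      # how the reportable result was derived

@dataclass
class ReportableResult:
    value: float
    units: str
    raw_data: RawData         # the records used to generate the result
    metadata: Metadata
    processed: ProcessedData
    signed_by: list           # tester and reviewer e-signatures
```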
Table 2: Records Associated with Manual Observation and Hybrid and Electronic Systems
| Record | Observation | Hybrid | Electronic |
| --- | --- | --- | --- |
| Raw data | | | |
| Metadata | | | |
| Processed data | | | |
| Information | | | |
| Knowledge | | | |
The aim of Table 2 and Figure 1 is to illustrate that simply presenting a series of definitions, even with regulatory expectations, is not enough. Context and explanation are everything, and figures help understanding. In the MHRA document, Figures 2 and 3 show how, and how not, to record data contemporaneously for a manufacturing system; the same approach should have been taken with many of the other definitions, as a picture is worth a thousand words.
Overall Assessment
My overall assessment of the MHRA data integrity guidance1 is that it is good, but not good enough, and needs improvement as we have discussed in this series of articles.
On the positive side, it provides a risk-based approach, and there is more information on the data governance system than was provided on the MHRA Web site when the topic was first announced in 2013.8 The guidance also identifies the responsibilities of data owners and senior management in relation to data integrity. However, it still needs improvement: for example, it confuses data with information, a figure is needed to link together several related definitions, and the section on design controls is poorly written and needs expansion to clarify what is required. As the MHRA has shown a willingness to listen to comments from industry and has updated the document within a short time frame, my hope is that these articles, along with other comments, will provide additional input to the review process.
References
- MHRA GMP Data Integrity Definitions and Guidance for Industry, version 2, March 2015.
- R.D. McDowall, Scientific Computing, Part 1: http://www.scientificcomputing.com/articles/2015/05/review-and-critique-mrha-data-integrity-guidance-industry-%E2%80%94-part-1-overview?cmpid=horizontalcontent
- R.D. McDowall, Scientific Computing, Part 2: http://www.scientificcomputing.com/articles/2015/05/review-and-critique-mrha-data-integrity-guidance-industry-%E2%80%94-part-2-data-governance-system?cmpid=horizontalcontent
- R.D. McDowall, Scientific Computing, Part 3: http://www.scientificcomputing.com/articles/2015/05/review-and-critique-mrha-data-integrity-guidance-industry-%E2%80%94-part-3-data-criticality-and-data-life-cycle?cmpid=horizontalcontent
- FDA Guide to the Inspection of Pharmaceutical Quality Control Laboratories, 1993: http://www.fda.gov/ICECI/Inspections/InspectionGuides/ucm074918.htm
- EU GMP Chapter 4, Documentation, 2011: http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm
- EU GMP Annex 11, Computerised Systems, 2011: http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm
- C. Burgess and R.D. McDowall, LC-GC Europe, scheduled September 2015.
- MHRA data integrity expectations, December 2013 (web reference now archived).
R.D. McDowall is Director of R D McDowall Ltd. He may be contacted at [email protected].