Data Management and Integrity
Laboratory Practices
Matt Davis and Gaye Camm, Senior Inspectors, Manufacturing Quality Branch, Medical Devices and Product Quality Division, TGA
21 November 2019
What is data integrity?
Data integrity is the extent to which data are:
– Complete
– Consistent
– Accurate
throughout all stages of the data lifecycle:
– Initial generation and recording
– Processing
– Use
– Retention, archiving, retrieval and destruction
Rationalisation
Data Criticality / Data Risk
Attributable
– Indicates who recorded the data or performed the activity: who did it, and when
– Signed and dated

Legible
– Possible to read or interpret the data after it is recorded
– No unexplained hieroglyphics
– Legibly corrected if necessary

Contemporaneous
– Data must be recorded at the time it was generated
– Recorded in close proximity to the activity

Original
– Data must be preserved in its unaltered state
– Original records, not copies

Accurate
– Data correctly reflect the action or observation made
– Checked where necessary
– Changes explained if not self-evident

Plus
– Complete, Consistent, Enduring, Available
Good documentation practice for paper records:
– System design: documents in the right place at the right time; clocks on the wall; appropriate controls
– Signatures and aliases; signature logs
– Workbooks and forms controlled; verified 'true copy' scans, reflective of the original
– Second-person checking; raw data verification
– No pencil, no white-out
– SOP for corrections and archiving
Controls for electronic records:
– Auto-saving; step-wise recording; system clock synchronisation
– User access control; e-signatures; metadata
– Metadata which permits reconstruction
– Data capture; manual data entry; source data and audit trail review
– Data security; audit trails; back-up; system validation
Data management
– E-signatures
– Audit trail review
– External calculation tools
– Raw data verification
– Data review SOPs

User access
– System administrator
– Defined user privileges (see the sketch below)
– Individual user access
– SOPs for user access control

Configuration
– Test method configuration
– Data back-up/archiving
– OS security
– Audit trails

Validation
– Periodic system review
– Change management
– Configuration management
– Hardware qualification
– Software validation
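The user access controls above (individual accounts, defined privileges, a separate system administrator) can be made concrete with a small role-based sketch. This is a minimal illustration in Python, assuming hypothetical role names and permissions; it is not drawn from the presentation or from any specific LIMS.

```python
from enum import Flag, auto

class Permission(Flag):
    """Illustrative permissions for a laboratory data system."""
    READ = auto()
    CREATE = auto()          # record new results
    MODIFY_METHOD = auto()   # change test method parameters
    APPROVE = auto()         # e-sign / approve results
    ADMIN = auto()           # user management, configuration

# Hypothetical role definitions: every user logs in with an individual
# account, and privileges come from the role, never from a shared login.
ROLES = {
    "analyst":  Permission.READ | Permission.CREATE,
    "reviewer": Permission.READ | Permission.APPROVE,
    "sysadmin": Permission.READ | Permission.ADMIN,  # cannot create or approve data
}

def can(role: str, action: Permission) -> bool:
    """Check whether a role holds a given permission."""
    return action in ROLES.get(role, Permission(0))

assert can("analyst", Permission.CREATE)
assert not can("analyst", Permission.MODIFY_METHOD)  # method changes restricted
assert not can("sysadmin", Permission.APPROVE)       # segregation of duties
```

Restricting method modification in this way is also what later allows audit trail review to focus only on the activities left open to change.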
Data review SOPs should define:
– Frequency, roles, responsibilities and approach to risk-based review of data (and metadata, including audit trails as relevant)
– Checks that the entire set of data is considered in the reported data, covering all locations where data may have been stored, including locations where voided, deleted, invalid or rejected data may have been stored

Review should be able to answer, for each stage:
– Who collected the data? When was it collected? What was collected? How was the data collected?
– Who, when, what, how was the data processed?
– Who, when, what, how was the data reviewed?
– Who, when, what, how was the data reported?
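These who/when/what/how questions map directly onto the fields of an audit-trail record. The sketch below is a minimal illustration assuming a simple immutable record type; the field names are not taken from any particular system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)  # frozen: entries cannot be edited once written
class AuditEntry:
    """One audit-trail record answering who/when/what/how."""
    who: str                         # individual user account, not a shared login
    when: datetime                   # taken from a synchronised system clock
    what: str                        # e.g. "result created", "method modified"
    how: str                         # e.g. "manual entry", "instrument capture"
    old_value: Optional[str] = None  # retained so changes remain reconstructable
    new_value: Optional[str] = None

entry = AuditEntry(
    who="analyst1",
    when=datetime.now(timezone.utc),
    what="assay result modified",
    how="manual entry",
    old_value="98.7",
    new_value="99.1",
)
```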
– Identifying data that has been changed or modified
– Review by exception: only look for anomalous or unauthorised activities (sketched below)
– Limiting permissions to change recipes/methods/parameters, or locking access to specific parameters or whole recipes/methods, may negate the need to examine the associated audit trails in detail, as changes can be easily observed, restricted or prohibited
– Whatever activities are left open to modification still need to be checked
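As a rough illustration of review by exception, the sketch below filters an audit trail down to the event types that indicate anomalous or unauthorised activity; both the event names and the record layout are assumptions made for the example.

```python
# Event types that, for this illustration, always warrant detailed review.
FLAGGED_EVENTS = {
    "method modified",
    "result deleted",
    "run aborted",
    "integration parameters changed",
    "failed login",
}

def entries_needing_review(audit_trail: list[dict]) -> list[dict]:
    """Keep only the audit-trail entries whose event type is flagged."""
    return [e for e in audit_trail if e.get("what") in FLAGGED_EVENTS]

trail = [
    {"who": "analyst1", "what": "result created"},
    {"who": "analyst2", "what": "method modified"},
]
print(entries_needing_review(trail))  # -> only the "method modified" entry
```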
– Changes should be explained when doing audit trail review, or covered by SOP, i.e. allowable changes to methods
– Data that aren't reported should be identified and investigated
– Aborted runs, failures, repeats and other atypical data must be reviewed
– There must be a sound scientific explanation for their exclusion
– Handling of such data should be defined in procedures
Data review should be documented in a similar way to documenting any review process: typically by signing the results as 'reviewed' or 'approved', following a data review SOP.
– For electronic data, this typically means electronically signing the electronic data set that has been reviewed and approved.
Paper printouts of original electronic records:
– Requirements for original electronic records must be met.
– To rely upon these printed summaries of results for future decision-making, a second person would have to review the original electronic data and associated audit trails, to verify that the printed summary is representative.
– It should be possible to show which data and metadata were looked at or opened (this would constitute an audit trail of the audit trail review).
– Define methods that can and cannot be adjusted – Document which actions are permissible, or restrict access to only allow the desirable actions – Document both original and manually integrated chromatograms – Electronic signatures/audit trail for manually integrated peaks
– Review reported data against electronic raw data (including audit trails)
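One way to review reported data against electronic raw data is to recompute the reported value from the raw peak areas and flag any discrepancy. The sketch below assumes a simple single-point external-standard calculation; the parameter names and tolerance are illustrative choices, not from the presentation.

```python
def reported_matches_raw(raw_area_sample: float, raw_area_standard: float,
                         standard_conc: float, reported: float,
                         tolerance: float = 0.01) -> bool:
    """Recalculate the result from raw peak areas and compare with the report.

    tolerance is a relative limit, e.g. 0.01 = 1 % of the reported value.
    """
    recalculated = standard_conc * raw_area_sample / raw_area_standard
    return abs(recalculated - reported) <= tolerance * reported

# 102.5 reported; the raw areas reproduce it, so the check passes.
print(reported_matches_raw(10250.0, 10000.0, 100.0, reported=102.5))  # True
```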
– Reviewers should understand how the audit trail is designed and how it works together with the data
– Reviewers should be trained in reviewing electronic data and metadata, such as audit trails, for the individual computerised systems used in the generation, processing and reporting of data
– Be aware that data can be obscured, for example through the use of hidden fields or data annotation tools
Spreadsheets are used for managing and presenting data, and their versatility and ease of use have led to wide application. When data contained within a spreadsheet cannot be reconstructed elsewhere and are essential to GMP activity, the data governance measures need to be rigorous:
– A spreadsheet may not record who made an entry, whether entries have been over-written and replaced (Permanent), or when the data entries had been made (Contemporaneous).
– Without version control, there may be different versions in use (Original).
– Entries and formulas can be altered without being detected (Accurate).
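One rigorous measure for the 'Accurate' gap is to record a cryptographic digest of the spreadsheet when it is approved, so that any later alteration becomes detectable. A minimal sketch, assuming a hypothetical file name:

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Digest recorded at the time the spreadsheet is reviewed and approved.
approved_digest = file_sha256(Path("assay_calcs_v3.xlsx"))  # hypothetical file

# ...later, before relying on the spreadsheet again:
if file_sha256(Path("assay_calcs_v3.xlsx")) != approved_digest:
    raise RuntimeError("Spreadsheet changed since approval - investigate")
```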
– Equilibration and conditioning requirements depend on factors such as the complexity of the analysis, the age and condition of the column, and detector lamp warm-up time.
– Conditioning injections should be supported by the development/validation/verification/transfer work performed in the laboratory, and this should be documented in the analytical procedure.
– The solution container label needs to be documented to GMP standards and clearly identified for the explicit purpose described in the procedure.
– The injection should be recorded in the sequence file as a system evaluation injection. If the SST criteria are met then the system is ready for the analysis; if they are not, the system is not ready for the run.
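To illustrate the SST decision, the sketch below checks replicate system-evaluation injections against example criteria (%RSD of peak areas and tailing factor). The numeric limits are illustrative assumptions, not TGA or pharmacopoeial acceptance criteria.

```python
from statistics import mean, stdev

def percent_rsd(values: list[float]) -> float:
    """Relative standard deviation of replicate injections, in percent."""
    return 100.0 * stdev(values) / mean(values)

def sst_passes(areas: list[float], tailing_factor: float,
               max_rsd: float = 2.0, max_tailing: float = 2.0) -> bool:
    """True if the system evaluation injections meet the SST criteria."""
    return percent_rsd(areas) <= max_rsd and tailing_factor <= max_tailing

# Peak areas from five replicate standard injections (illustrative values).
areas = [10512.0, 10488.0, 10530.0, 10495.0, 10507.0]
print(sst_passes(areas, tailing_factor=1.3))  # -> True: ready for the run
```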
Contributing causes:
– Competence/supervision
– Lack of effective controls/secondary checks
– Computerised system configuration
– Organisational culture/resources

Failures observed:
– No testing conducted
– Not counting all colonies
– OOL data not being investigated
– Resampling/retesting without justification
– Samples not taken or "lost" in transit
– No reconciliation of samples
– Incubation conditions incorrect
– Using unvalidated test methods
– Not recording all key test data
– Worksheets ripped up and replaced
– No reconciliation of forms used
– Lack of proper computerised system security
– Colony morphology not matching identification results
Sampling:
– Sampling schedule/plans
– Training of technicians
– Sample forms
– Detailed collection methods
– Identity of sampler recorded

Testing:
– Test volumes/weights recorded
– Calibrated equipment used
– Reference to all reagents
– Reference to validated methods/dilution factors
– Samples processed under clean conditions, e.g. LAF
– Negative controls for processed samples
– Identity of tester/equipment recorded

Incubation:
– Incubation records maintained
– Min/max incubation time defined and validated
– All transfers/sub-culturing recorded
– All incubated samples tagged and identified

Reading and recording:
– Technicians trained in detection, enumeration and morphology: clear SOPs, photos
– Controlled environment for reading: light, magnification
– Counting device used for colonies
– Clear acceptance criteria/limits
– OOL & ID policy for manual recording
– All samples reconciled
– Results recorded
– Calculations applied correctly (see the sketch below)
– Second checks and verification in accordance with quality risk management
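'Calculations applied correctly' is easy to verify when the calculation itself is explicit: for colony counts, CFU/mL is the count multiplied by the dilution factor and divided by the volume plated. A short sketch with illustrative values:

```python
def cfu_per_ml(colonies: int, dilution_factor: int, volume_plated_ml: float) -> float:
    """CFU/mL = colony count x dilution factor / volume plated (mL)."""
    return colonies * dilution_factor / volume_plated_ml

# 42 colonies on the 10^-3 dilution plate, 0.1 mL plated (illustrative).
result = cfu_per_ml(colonies=42, dilution_factor=1000, volume_plated_ml=0.1)
print(f"{result:.1e} CFU/mL")  # -> 4.2e+05 CFU/mL
```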
– Unconscious incompetence: unaware of the gap
– Conscious incompetence: aware of the gap but not yet able to deal with it
– Conscious competence: able to deal with the problem, but only with effort
– Unconscious competence: dealing with the problem is automatic