Data Management and Integrity Laboratory Practices


1. Data Management and Integrity Laboratory Practices
   Matt Davis and Gaye Camm, Senior Inspectors, Manufacturing Quality Branch
   Medical Devices and Product Quality Division, TGA
   21 November 2019

2. What is data integrity?
   • Data integrity is the extent to which data is:
     – Complete
     – Consistent
     – Accurate
   • Throughout the data lifecycle:
     – Initial generation and recording
     – Processing
     – Use
     – Retention, archiving, retrieval and destruction
   • (PIC/S Good Practices for Data Management and Integrity, PI 041)

3. Creating the right environment
   • Data management controls embedded in the PQS:
     – System design to ensure good DI practices
     – QRM approach to data integrity
     – Ongoing risk review, rationalised by data criticality/risk
     – Self-inspection
   • Clear understanding of the importance of data integrity at all levels of the organisation
   • Internal reporting is encouraged
   • Mature, open management approach to data integrity

4. Risk management approach to data integrity
   • Data criticality:
     – CQA/batch release data > cleaning records
     – Data relating to product quality/safety
   • Data risk:
     – Vulnerability of data to alteration, deletion, recreation, loss or deliberate falsification
   • Outcome: an effective control strategy to manage the identified risks (see the sketch below)
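As an illustration of combining data criticality with data risk, the toy Python sketch below scores each data set and flags where additional controls are warranted. The 1–3 scales, example data sets and threshold are hypothetical assumptions, not values from the presentation or PI 041.

```python
# Toy sketch: ranking data-integrity risk as criticality x vulnerability.
# Scales and threshold are illustrative assumptions only.

DATA_SETS = {
    # name: (criticality 1-3, vulnerability 1-3)
    "batch release assay": (3, 2),   # CQA data on a shared-login CDS
    "cleaning records":    (1, 1),   # paper logbook, low product impact
    "stability results":   (3, 3),   # editable spreadsheet, no audit trail
}

REVIEW_THRESHOLD = 6  # hypothetical cut-off for enhanced controls

def risk_score(criticality: int, vulnerability: int) -> int:
    """Simple multiplicative score; higher means more control effort."""
    return criticality * vulnerability

for name, (crit, vuln) in DATA_SETS.items():
    score = risk_score(crit, vuln)
    action = "enhanced controls" if score >= REVIEW_THRESHOLD else "routine controls"
    print(f"{name}: score {score} -> {action}")
```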

5. ALCOA+ Principles
   • Attributable: indicates who recorded the data or performed the activity; signed/dated where necessary; who wrote it / when
   • Legible: clearly possible to read or interpret the data after it is recorded; permanent; no unexplained hieroglyphics; modifications explained if not self-evident
   • Contemporaneous: it must be recorded at the time it was generated; close proximity to occurrence
   • Original: data must be preserved in its unaltered state; data checked; certified copies
   • Accurate: data must correctly reflect the action/observation made; if not, why not; properly corrected if necessary
   • Plus: complete, consistent, enduring, available
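To make the principles concrete, here is a minimal Python sketch of a laboratory record that carries ALCOA attributes as explicit fields. The class and field names are illustrative assumptions, not part of any standard or system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LabRecord:
    """Illustrative record with ALCOA attributes as explicit fields."""
    value: str                 # the observation itself (Accurate)
    recorded_by: str           # Attributable: who made the entry
    recorded_at: datetime      # Contemporaneous: captured at generation time
    corrections: list = field(default_factory=list)  # changes explained, never erased

    def correct(self, new_value: str, reason: str, user: str) -> None:
        # Original value is preserved; the change is explained and attributable.
        self.corrections.append(
            (self.value, new_value, reason, user, datetime.now(timezone.utc))
        )
        self.value = new_value

# Example: a correction keeps the original value and the reason on file.
rec = LabRecord("pH 6.8", "analyst_01", datetime.now(timezone.utc))
rec.correct("pH 6.9", "transcription error, see raw printout", "analyst_01")
print(rec.corrections)
```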

6. Designing paper systems which reduce opportunities for falsification
   • Attributable: signatures, aliases; signature logs
   • Legible/Permanent: no pencil, white-out or soluble ink; SOP for corrections
   • Contemporaneous: system design – documents in the right place at the right time, clocks on the wall, control of blank forms
   • Original: workbooks and forms controlled and verified; 'true copy' scans and archiving
   • Accurate: reflective of the observation; data checking, raw data verification

7. Designing electronic systems which reduce opportunities for falsification
   • Attributable: user access control; e-signatures; metadata
   • Legible/Permanent: data security; audit trails; back-up; system validation
   • Contemporaneous: auto-saving; step-wise recording; system clock synchronisation
   • Original: metadata which permits reconstruction of the data
   • Accurate: data capture; manual data entry checks; source data and audit trail review
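A minimal sketch of one of these controls: an append-only audit trail that stamps each event with a synchronised (UTC) clock and the acting user. The file name and event fields are assumptions for illustration, not a real CDS format.

```python
import json
from datetime import datetime, timezone

AUDIT_LOG = "audit_trail.jsonl"  # hypothetical append-only log file

def log_event(user: str, action: str, record_id: str, detail: str = "") -> None:
    """Append one audit-trail entry; entries are never edited in place."""
    entry = {
        "at": datetime.now(timezone.utc).isoformat(),  # synchronised UTC clock
        "user": user,          # attributable
        "action": action,      # e.g. "create", "modify", "delete"
        "record": record_id,
        "detail": detail,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

log_event("analyst_01", "create", "HPLC-2019-1121", "sequence started")
```

In a real system, append-only behaviour would be enforced by the application and operating-system permissions, not by convention alone.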

8. Overarching DMDI policy
   • Procedures
   • Validation
   • Audit trails
   • Accurate and complete data
   • Secure data retention
   • Unique user names/passwords/biometrics
   • Time/date stamps
   • Prevention of unauthorised changes
   • Traceability
   • Independent review
   • Correct movement of data
   • Detecting wrongful acts
   • Employee adherence
   • Training
   • Periodic review
   • Control of outsourced activities
   • Security
   • Corrective actions
   • Detecting non-compliance

9. Analytical laboratories – Common concerns & controls
   • DI expectations:
     – Lab electronic systems
     – Data review processes
     – Audit trail review processes
     – Manual integration
     – Analyst training
     – Spreadsheet management
     – Test injections

10. Laboratory electronic systems
    • Validation: software validation; hardware qualification; configuration management; change management; periodic system review
    • Configuration: audit trails; OS security; data back-up/archiving; test method configuration; e-signatures
    • User access: SOPs for user access control; individual user access; defined user privileges (sketched below); system administrator
    • Data management: data review SOPs; raw data verification; external calculation tools; audit trail review
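As a sketch of the "defined user privileges" control, the Python below checks a user's role before allowing an action. The roles, actions and mapping are hypothetical assumptions for illustration.

```python
# Hypothetical role-to-privilege map for a lab data system.
PRIVILEGES = {
    "analyst":       {"acquire", "process"},
    "reviewer":      {"acquire", "process", "review"},
    "administrator": {"configure", "manage_users"},
}

def authorise(role: str, action: str) -> None:
    """Raise if the role is not allowed to perform the action."""
    if action not in PRIVILEGES.get(role, set()):
        raise PermissionError(f"role '{role}' may not '{action}'")

authorise("reviewer", "review")          # allowed
try:
    authorise("analyst", "configure")    # blocked: analysts cannot change configuration
except PermissionError as e:
    print(e)
```

Note that the administrator role here deliberately lacks review rights, reflecting the common expectation that system administrators are independent of the data they administer.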

11. Data review processes
    • Consider electronic and paper-based records
    • A clear SOP is required for data review:
      – Frequency, roles, responsibilities and the approach to risk-based review of data (and metadata, including audit trails as relevant)
      – Ensure the entire data set is considered in the reported data; this should include checks of all locations where data may have been stored, including locations where voided, deleted, invalid or rejected data may be held (a completeness check is sketched below)
    • Who-When-What-How:
      – Who collected the data / when was it collected / what was collected / how was it collected?
      – Who, when, what, how... was the data processed?
      – Who, when, what, how... was the data reviewed?
      – Who, when, what, how... was the data reported?
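A toy sketch of the completeness check described above: every acquired result must appear either in the reported set or in a documented-exclusions list. Names and structures are illustrative assumptions.

```python
# Toy completeness check: nothing acquired may silently disappear.
acquired = {"inj-01", "inj-02", "inj-03", "inj-04"}           # all runs on the system
reported = {"inj-01", "inj-02", "inj-04"}                     # results in the reported dataset
excluded = {"inj-03": "pressure spike, invalidated per SOP"}  # documented exclusions

unaccounted = acquired - reported - set(excluded)
assert not unaccounted, f"results with no explanation: {unaccounted}"
print("complete: every acquired result is reported or has a documented exclusion")
```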

12. Audit trail review processes
    • Technical controls can aid the secondary review of audit trails:
      – Identifying data that has been changed or modified
      – Review by exception – only look for anomalous or unauthorised activities (see the sketch below)
      – Limiting permissions to change recipes/methods/parameters, or locking access to specific parameters or whole recipes/methods, may negate the need to examine the associated audit trails in detail, as changes can be easily observed, restricted or prohibited
      – Whatever activities are left open to modification still need to be checked
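A minimal sketch of review by exception: filter the audit trail down to the event types that warrant a closer look, rather than reading every line. The event names and the "anomalous" set are assumptions for illustration.

```python
# Review by exception: surface only the audit-trail entries worth a close look.
ANOMALOUS = {"modify", "delete", "manual_integration", "method_change"}  # assumed event types

audit_trail = [
    {"user": "analyst_01", "action": "create", "record": "inj-01"},
    {"user": "analyst_01", "action": "manual_integration", "record": "inj-02"},
    {"user": "admin_02",   "action": "method_change", "record": "assay_method_v3"},
    {"user": "analyst_01", "action": "create", "record": "inj-03"},
]

exceptions = [e for e in audit_trail if e["action"] in ANOMALOUS]
for e in exceptions:
    print(f"review: {e['user']} performed {e['action']} on {e['record']}")
```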

13. Audit trail reviews – Common issues
    Reasons for change or deletion of GMP-relevant data are often not documented.
    Annex 11 § 9: "...for change or deletion of GMP-relevant data the reason should be documented"
    • The method for recording legitimate changes to data needs to be considered when doing audit trail review, or covered by SOP, e.g. allowable changes to methods (a sketch follows below)
    • An explanation is needed for ALL data recorded (complete data), including results that aren't reported
    • Deviations from standard procedures or atypical results should be investigated
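One way to make the Annex 11 § 9 expectation hard to miss is for the system to refuse any change with no documented reason. A minimal Python sketch; the record structure is hypothetical and nothing here is a real CDS API.

```python
def change_record(record: dict, new_value: str, user: str, reason: str) -> None:
    """Apply a change only when a documented reason accompanies it (Annex 11 § 9)."""
    if not reason.strip():
        raise ValueError("a reason must be documented for any change to GMP-relevant data")
    record.setdefault("history", []).append(
        {"old": record["value"], "new": new_value, "user": user, "reason": reason}
    )
    record["value"] = new_value

rec = {"value": "99.2%"}
change_record(rec, "98.7%", "analyst_01", "re-processed with approved integration method")
print(rec["history"])
```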

14. Reasons for changes to data...
    There may be valid reasons for invalidating data and repeating acquisition, e.g. equipment malfunction, incorrect sample/standard preparation, power failure.
    • Invalid runs, failures, repeats and other atypical data need to be recorded and investigated, and CAPA implemented where appropriate
    • All data should be included in the dataset unless there is a documented scientific explanation for its exclusion
    • These events should also be reviewed for trends at an appropriate timepoint
    • Laboratories may need a new SOP for this, in addition to standard OOS/OOT procedures

15. Audit trail challenges
    Use of paper copies for review of electronic systems: many laboratories create paper copies of electronic records in the form of reports and rely on these to conduct the audit trail review. These reports can be huge (> 80 pages per chromatographic run when audit trails are included), and there is a significant risk that critical notifications are lost in the sea of data.

16. Recording of audit trail review
    How do manufacturers demonstrate they have reviewed audit trails?
    • Audit trail reviews should be documented in the same way as any other review process – typically by signing the results as 'reviewed' or 'approved', following a data review SOP
    • For electronic records, this is typically signified by electronically signing the electronic data set that has been reviewed and approved (a sketch follows below)
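A minimal sketch of what "electronically signing the data set" can look like at the data level: hash the reviewed files and record the reviewer, time and hash together, so the sign-off is bound to the exact content reviewed. This uses only the Python standard library; a real e-signature would be applied through the validated system's own signature mechanism, not a bare hash.

```python
import hashlib
import json
from datetime import datetime, timezone

def sign_review(data_files: list, reviewer: str) -> dict:
    """Record a review sign-off bound to the exact content reviewed."""
    digest = hashlib.sha256()
    for blob in data_files:        # each blob is the raw bytes of a reviewed file
        digest.update(blob)
    return {
        "reviewer": reviewer,
        "signed_at": datetime.now(timezone.utc).isoformat(),
        "sha256": digest.hexdigest(),  # any later change to the data breaks the link
        "meaning": "audit trail and data reviewed",
    }

signature = sign_review([b"...chromatogram data...", b"...audit trail export..."], "reviewer_02")
print(json.dumps(signature, indent=2))
```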

17. Recording of audit trail review
    What about manufacturers who don't have electronic signatures available?
    • A hybrid approach (not the preferred approach) uses paper printouts of original electronic records:
      – The requirements for original electronic records must still be met
      – To rely on these printed summaries of results for future decision-making, a second person has to review the original electronic data and any relevant metadata, such as audit trails, to verify that the printed summary is representative
    • It seems unreasonable to require specific evidence of exactly which records and metadata were looked at or opened – this would constitute an audit trail of the audit trail review

18. Manual integration controls
    • Automatic integration should be the default; manual integration only where absolutely required
    • An SOP for integration is required:
      – Define which methods can and cannot be adjusted
      – Document which actions are permissible, or restrict access to allow only the desirable actions
      – Document both the original and the manually integrated chromatograms (see the sketch below)
      – Electronic signatures/audit trail entries for manually integrated peaks
    • Review results to ensure compliance:
      – Review reported data against electronic raw data (including audit trails)
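A toy sketch of keeping both versions on file: a manual integration never overwrites the automatic result, it is stored alongside it with the analyst and reason. The structure is an illustrative assumption, not a CDS format.

```python
# Toy structure: manual integration is stored alongside, never over, the original.
peak = {
    "injection": "inj-02",
    "auto_area": 15234.7,   # original, automatically integrated result
    "manual": None,         # filled in only if re-integration was needed
}

def integrate_manually(peak: dict, new_area: float, analyst: str, reason: str) -> None:
    if not reason.strip():
        raise ValueError("manual integration requires a documented reason")
    peak["manual"] = {"area": new_area, "by": analyst, "reason": reason}

integrate_manually(peak, 15102.3, "analyst_01", "baseline disturbance, peak re-drawn")
print(peak)  # both the original and the manual result remain reviewable
```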

19. Staff Training
    • It is essential that reviewers have a good knowledge of how the computer system or software works, and how the audit trail is designed and works together with the data
    • This may require specific training in evaluating the configuration settings and reviewing electronic data and metadata, such as audit trails, for the individual computerised systems used in the generation, processing and reporting of data
    • Training should also cover system vulnerabilities, such as data being overwritten or obscured through the use of hidden fields or data annotation tools
