

1. Modular Programs
Sebastian Schneeweiss, MD, ScD; Jennifer Nelson, PhD
Mini-Sentinel Methods Core, January 31, 2013
info@mini-sentinel.org

2. Modular approach to drug safety monitoring in a distributed database system

Principal idea
• Pre-programmed modules can be quickly activated to run adjusted analyses across data partners
• For monitoring, modules will be run repeatedly as data are refreshed

Some specifications
• Validated programming code
• Can be run asynchronously across data partners, as data get refreshed, while preserving data privacy
• Confounding adjustment via self-controlled designs, PS matching, or regression analyses
• Estimates ratio and difference measures (rate or risk)
• Sequential (or group sequential) analyses

3. Prospective surveillance: estimating risk
Newly marketed product → Define exposures, outcomes, etc. → Choose analysis approach → Estimate the risk
Risk estimation modules: Module 1 (self-controlled), Module 2 (cohort matching), Module 3 (cohort regression)

4. Prospective surveillance: estimating risk
Index identification, then cohort identification (MP3), feeding three modules:
• Module 1 (self-controlled). Key parameters: exposure crossover; risk and control windows; exposure time trend adjustment
• Module 2 (cohort matching). Key parameters: score-based matching (PS, DRS); 1:1/variable-ratio matching; AT vs. ITT
• Module 3 (cohort regression). Key parameters: regression; IPT-weighted; hd-PS regression; tailored to the rare-event setting
Sensitivity analyses: population subgroups; dose subgroups

5. Module 2 in detail
Coordinating center:
• Specify input parameters: cohort, outcomes, covariates
• Transmit code
• Evaluate diagnostics and aggregate data across partners
• Apply alerting algorithms and interpret results
• Iterate at next data refresh
Multiple data partners:
• Start Module 2
• Calculate confounder scores (PS, hd-PS, DRS)
• Run diagnostics
• Create de-identified result files
• Transmit data
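The data-partner step of matching new initiators on a confounder score can be sketched as below. This is an illustrative reconstruction, not Mini-Sentinel's actual modules (which are SAS programs); the 1:1 greedy nearest-neighbor strategy, the caliper value, and the example scores are all assumptions.

```python
def greedy_ps_match(ps_treated, ps_comparator, caliper=0.05):
    """1:1 greedy nearest-neighbor match on the propensity score.

    ps_treated, ps_comparator: dicts mapping patient id -> estimated PS.
    Each comparator is used at most once, and only when the PS distance
    is within the caliper.  Treated patients are processed in order of
    ascending PS; returns a list of (treated_id, comparator_id) pairs.
    """
    available = dict(ps_comparator)   # comparators still unmatched
    pairs = []
    for t_id, t_ps in sorted(ps_treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # nearest remaining comparator on the PS scale
        c_id = min(available, key=lambda cid: abs(available[cid] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Illustrative scores (assumed numbers, not real data)
treated = {"T1": 0.30, "T2": 0.50}
comparators = {"C1": 0.31, "C2": 0.52, "C3": 0.90}
pairs = greedy_ps_match(treated, comparators)
```

Only the matched-pair identifiers and aggregate counts would need to leave the data partner, which is what makes the de-identified result files on this slide possible.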

6. Diagnostics: balance before matching
Table 1. Cohort of new initiators of rofecoxib and non-selective NSAID (unmatched), primary analysis covariate balance

Characteristic                      rofecoxib N (%)   NSAID N (%)    Absolute diff.   Standardized diff.
Number of patients                  9409 (100.0%)     9977 (100.0%)
Events while on therapy               39 (0.4%)         15 (0.2%)
Person-time at risk, mean (SD)      59.9 (33.3)       46.4 (32.5)
Patient characteristics
  Age, mean (SD)                    76.3 (10.7)       73.1 (12.2)        3.2              3.2
  60-70                             1305 (13.9%)      1679 (16.8%)      -2.9             -0.082
  70-80                             3631 (38.6%)      3883 (38.9%)      -0.3             -0.007
  80-90                             3179 (33.8%)      2619 (26.3%)       7.5              0.164
  90-100                             580 (6.2%)        395 (4.0%)        2.2              0.101
Gender (F)                          7764 (82.5%)      7374 (73.9%)       8.6              0.208
Recorded use of:
  ACE inhibitors                    1224 (13.0%)      1351 (13.5%)      -0.5             -0.016
  ARB                                567 (6.0%)        535 (5.4%)        0.6              0.029
  Anticoagulants                     548 (5.8%)        328 (3.3%)        2.5              0.122
  (and many more)
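The standardized differences in the last column follow the standard formula for a binary covariate; a minimal sketch, checked against the Gender (F) row of the table:

```python
from math import sqrt

def std_diff_binary(p1, p0):
    """Standardized difference for a binary covariate:
    (p1 - p0) / sqrt((p1*(1-p1) + p0*(1-p0)) / 2)."""
    return (p1 - p0) / sqrt((p1 * (1 - p1) + p0 * (1 - p0)) / 2)

# Gender (F): 82.5% of rofecoxib vs 73.9% of NSAID initiators
sd = std_diff_binary(0.825, 0.739)  # ~0.209
```

This reproduces the table's 0.208 up to rounding of the input percentages. A common rule of thumb treats absolute standardized differences above 0.1 as meaningful imbalance, which flags the age, gender, and anticoagulant rows here.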

7. Diagnostics: balance before/after matching [figure]

8. Data aggregation across data partners (launch + 3 mos.)

                  Exposed   Comparator
DP 1   Events         10          5
       No event       90         95
       Total         100        100
…
DP n   Events         15         10
       No event      185        190
       Total         200        200
All    Events         25         15
       No event      275        285
       Total         300        300
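Per-partner 2x2 counts like these can be pooled with a Mantel-Haenszel estimator. This is a generic sketch of how the aggregated counts yield a pooled risk ratio, not Mini-Sentinel's production code; the counts are the ones shown on this slide.

```python
def mh_risk_ratio(tables):
    """Mantel-Haenszel risk ratio pooled over per-partner 2x2 tables.

    Each table: (a, b, n1, n0) = events among exposed, events among
    comparators, total exposed, total comparators.
    """
    num = sum(a * n0 / (n1 + n0) for a, b, n1, n0 in tables)
    den = sum(b * n1 / (n1 + n0) for a, b, n1, n0 in tables)
    return num / den

# Counts from the slide: DP 1 and DP n
tables = [(10, 5, 100, 100), (15, 10, 200, 200)]
rr = mh_risk_ratio(tables)  # ~1.67
```

Because only the stratified counts are needed, each partner can transmit a small summary table rather than patient-level data, which is the privacy-preserving point of the aggregation step.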

9. Data aggregation across DPs & over time
[Table: cumulative 2x2 exposure-outcome counts per data partner at each successive 3-month data refresh after launch; counts grow as data accumulate]

10. Asynchronous database refreshes
[Table: as slide 9, but data partners refresh on different schedules, so cumulative counts arrive asynchronously]

11. Visualizing heterogeneity
[Table of per-partner counts over successive 3-month refreshes]
Center effects; heterogeneity by time since marketing

12. Data aggregation (Report: Rassen et al.)
[Figure]
Rassen JA, et al. Pharmacoepidemiol Drug Saf 2010;19:848-57

13. Aggregation over time
[Diagram: PS-match performed within each successive data interval, results then aggregated]

14. Prospective surveillance: alerting
Risk estimation: Module 1 (self-controlled), Module 2 (cohort matching), Module 3 (cohort regression), plus sensitivity analyses
→ Aggregate accumulating results over time
→ Apply alerting rules
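The "apply alerting rules" step can be illustrated with a simple group-sequential check: at each data refresh, compute a test statistic from the cumulative counts and compare it against a fixed signaling boundary. The Poisson/binomial-style z-statistic, the boundary value of 2.85, and the example counts are all illustrative assumptions; the actual sequential rules (e.g., maxSPRT-type tests) are more elaborate.

```python
from math import sqrt

def sequential_alert(looks, boundary=2.85):
    """Return the (1-based) look at which cumulative evidence of excess
    risk first crosses a fixed signaling boundary, or None.

    looks: list of (events_exposed, events_comparator) cumulative counts
    from a 1:1 matched cohort, one entry per data refresh.  Under the
    null of no effect, events split 50:50 between the arms, so
    z = (a - n/2) / sqrt(n/4) with n = a + b.
    """
    for i, (a, b) in enumerate(looks, start=1):
        n = a + b
        if n == 0:
            continue
        z = (a - n / 2) / sqrt(n / 4)
        if z > boundary:
            return i
    return None

# Cumulative counts over four refreshes (illustrative)
looks = [(10, 5), (25, 15), (50, 25), (80, 35)]
first_alert = sequential_alert(looks)
```

The fixed boundary stands in for whatever alpha-spending scheme the pre-monitoring plan (next slide) selects; the key design point is that the rule is specified before monitoring begins.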

15. Pre-monitoring activities
• The acceptable false-positive rate may vary with: acceptable level of risk; availability of alternative medications; severity of event(s); expected beneficial effect; anticipated utilization
• Monitoring intervals
• Duration of monitoring

16. Post-monitoring activities
• Sensitivity analyses: confounding; exposure risk-window; incident-user definition window; AT vs. ITT
• Subgroup analyses as needed
• Comprehensive presentation of decision-relevant information

17. Prospective surveillance: reporting
Newly marketed product → Define exposures, outcomes, etc. → Choose analysis approach → Estimate the risk → Aggregate results over time → Apply alerting rules → Report to FDA → FDA reports to public when appropriate

18. What happens when we find something?
Prompt, pre-planned product-specific assessment of a positive signal. Examples of follow-up activities:
• Data validity checks, analytic code checks
• Adjust for additional confounders
• Test against other comparators
• Medical chart validation of cases
• Quantitative bias analysis
• Detailed epidemiologic investigation to assess causality
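The "quantitative bias analysis" bullet can be illustrated with the classic external-adjustment formula for a single unmeasured binary confounder: divide the observed risk ratio by the bias factor implied by the confounder's strength and its prevalence in each exposure group. The formula is standard sensitivity-analysis material; the example numbers are assumptions, not results from the surveillance system.

```python
def externally_adjusted_rr(rr_observed, rr_cd, p1, p0):
    """Adjust an observed risk ratio for one unmeasured binary confounder.

    rr_cd: confounder-outcome risk ratio
    p1, p0: confounder prevalence among exposed / comparator patients
    Bias factor = (rr_cd*p1 + 1 - p1) / (rr_cd*p0 + 1 - p0).
    """
    bias = (rr_cd * p1 + (1 - p1)) / (rr_cd * p0 + (1 - p0))
    return rr_observed / bias

# e.g., observed RR 1.5; a confounder that triples outcome risk and is
# present in 40% of exposed vs 20% of comparator patients (assumed)
adj = externally_adjusted_rr(1.5, 3.0, 0.40, 0.20)  # ~1.17
```

Running such scenarios over plausible ranges of the bias parameters shows whether a signal could be explained away by confounding alone, before committing to chart validation or a full epidemiologic investigation.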
