

  1. Rethinking Evidence Synthesis
  Prof Enrico Coiera, Director, Centre for Health Informatics, Australian Institute of Health Innovation, Macquarie University, Sydney, Australia

  2. Variation in care is high
  • The CareTrack study found that 57% of Australians receive care in line with level 1 evidence or consensus guidelines (Med J Aust 2012; 197(2): 100-105).
  • Causes of practice variation include:
    • Patient-specific needs, e.g. co-morbidity
    • Patient preferences
    • Clinician preferences
    • Working with out-of-date evidence

  3. Evidence synthesis is slow
  • In Australia we don’t always deliver the care that guidelines and experts agree on as appropriate (PMID 22794056).
  • Systematic reviews could be updated as soon as new study results are available; this means we need to do the right trials (PMID 17638714).
  • Systematic reviews can take years to complete and are extremely resource-intensive, so many are out of date, some as soon as they are published (PMID 20644625).

  4. Clinical evidence is often biased
  • Due to biases in the design, undertaking, reporting, and synthesis of clinical research, about 85% of it is wasted (PMID 24411643).
  • Trials that are funded by industry are less likely to be published within 2 years, and when they are, they are more likely to have favourable results (PMID 20679560).
  • When trials are published, some outcomes are incompletely reported or not reported at all; safety outcomes are affected more than efficacy outcomes (PMID 23861749).
  • When reviewers and systematic reviewers synthesise the results from many clinical studies, those with financial conflicts of interest are more likely to report favourably (PMID 25285542).

  5. RCTs and guidelines have limitations
  • They do not represent real-world populations:
    • Co-morbidities are excluded
    • They may be highly geographically localised, introducing biases
    • They are often too small to detect small effect sizes and too short to detect long-term effects
  • Patients have their own preferences once benefits and harms are explained.

  6. Panel
  • Dr Guy Tsafnat, “The automation of evidence summarisation”. Leads the Computable Evidence Lab, which is dedicated to the automation and optimisation of evidence-based medicine.
  • Dr Julian Elliott, “Combining human effort and machines”. Head of Clinical Research at Alfred Hospital and Monash University, and Senior Researcher at the Australasian Cochrane Centre.
  • Dr Adam Dunn, “When biases in evidence synthesis lead to harm or waste”. Leads the Computational Epidemiology Lab, monitoring biases in the design, reporting, and synthesis of clinical trials.
  • Dr Blanca Gallego-Luxan, “Learning from ‘patients like mine’”. Leads the Health Analytics Lab, designing, analysing, and developing models derived from complex empirical data.
  MQ | AIHI | CHI


  8. Systematic Reviews: a robust model for evidence-based medicine
  * Tsafnat, Glasziou, Choong, et al., Syst Rev 3:74, 2014
  AIHI | CHI | CEL

  9. The Manual Process
  Preparation → Retrieval → Appraisal → Synthesis → Meta-Analysis → Write-up
  * Tsafnat, Glasziou, Dunn, Coiera, The BMJ 346:f139, 2013
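One of the stages above, meta-analysis, pools effect estimates across studies. As a minimal sketch (not the authors' tooling, and with invented numbers), fixed-effect inverse-variance pooling weights each study by the inverse of its variance:

```python
import math

def fixed_effect_meta(estimates, variances):
    """Fixed-effect inverse-variance pooling of study effect estimates."""
    weights = [1.0 / v for v in variances]  # precision weights
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))      # standard error of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical trials reporting log odds ratios and their variances.
pooled, ci = fixed_effect_meta([0.30, 0.15, 0.25], [0.04, 0.09, 0.05])
```

The pooled estimate lands between the study estimates, pulled towards the most precise (lowest-variance) trial.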

  10. Automation
  Preparation → Retrieval → Appraisal → Synthesis → Meta-Analysis → Write-up
  * Tsafnat, Glasziou, Dunn, Coiera, The BMJ 346:f139, 2013
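The staged view above can be expressed as a chain of functions, each transforming the output of the previous stage. This is only an illustrative sketch; the stage bodies are hypothetical stand-ins, not the systems described in the paper:

```python
def prepare(question):
    # Turn a clinical question into a structured protocol (stand-in for PICO framing).
    return {"question": question, "include_designs": {"RCT"}}

def retrieve(protocol):
    # Stand-in for a literature search; a real system would query PubMed.
    studies = [{"id": 1, "design": "RCT"},
               {"id": 2, "design": "cohort"},
               {"id": 3, "design": "RCT"}]
    return {"protocol": protocol, "studies": studies}

def appraise(result):
    # Screen retrieved studies against the protocol's inclusion criteria.
    keep = result["protocol"]["include_designs"]
    return [s for s in result["studies"] if s["design"] in keep]

def run_pipeline(question):
    artefact = question
    for stage in (prepare, retrieve, appraise):  # synthesis etc. would follow
        artefact = stage(artefact)
    return artefact

included = run_pipeline("Does drug X reduce mortality?")
```

Automating a stage then means replacing its stand-in body with real tooling while keeping the artefact interfaces between stages fixed.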

  11. Automation
  * Tsafnat, Glasziou, Dunn, Coiera, The BMJ 346:f139, 2013

  12. Clinical Queries / Guidelines

  13. Search Automation: saved strategies
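One way saved strategies can be automated is via NCBI's E-utilities: store the strategy string and periodically re-run it restricted to records added since the last run. A hedged sketch; the strategy string is invented, while the `esearch` parameters used are standard E-utilities ones:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_update_query(saved_strategy, since):
    """Build a URL that re-runs a saved PubMed strategy for records added since a date."""
    params = {
        "db": "pubmed",
        "term": saved_strategy,
        "datetype": "edat",   # Entrez date: when the record entered PubMed
        "mindate": since,     # YYYY/MM/DD
        "maxdate": "3000",    # far-future bound; mindate/maxdate must be used together
        "retmode": "json",
    }
    return f"{EUTILS}?{urlencode(params)}"

url = build_update_query('"botulinum toxin"[tiab] AND randomized', "2014/01/01")
```

A scheduler can fetch this URL on a cadence and diff the returned PMIDs against those already screened, so the review only ever looks at new records.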

  14. Citation Networks
  * Robinson, Dunn, Tsafnat, Glasziou, Journal of Clinical Epidemiology 67(7), 2014
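A citation network like the one Robinson et al. analyse can be represented as an adjacency map from each trial to the trials it cites; in-degree then shows which earlier trials later work builds on. A toy sketch with invented trial labels:

```python
from collections import Counter

# Each key cites the trials in its list (edges point backwards in time).
cites = {
    "trial_C": ["trial_A", "trial_B"],
    "trial_D": ["trial_A"],
    "trial_E": ["trial_A", "trial_C"],
}

# In-degree: how many later trials cite each earlier trial.
in_degree = Counter(ref for refs in cites.values() for ref in refs)
most_cited, n = in_degree.most_common(1)[0]
```

Trials that are never cited by later trials in the same network are also informative: they may indicate evidence that subsequent studies failed to build on.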

  15. Information Extraction from Trials
  * Kiritchenko et al., BMC Med Inform Decis Mak 10, 2010 (text from Kawamura et al., Dev Med Child Neurol 49, 2007)
  “This study compared the effects of low and high doses of botulinum toxin A (BTX-A) to improve upper extremity function. Thirty-nine children (22 males, 17 females) with a mean age of 6 years 2 months (SD 2y 9mo) diagnosed with spastic hemiplegia or triplegia were enrolled into this double-blind, randomized controlled trial. The high-dose group received BTX-A in the following doses: biceps 2U/kg, brachioradialis 1.5U/kg, common flexor origin 3U/kg, pronator teres 1.5U/kg, and adductor/opponens pollicis 0.6U/kg to a maximum of 20U. The low-dose group received 50% of this dosage. Outcomes were measured at baseline and at 1 and 3 months after injection, and results were analyzed with a repeated-measures analysis of variance.”
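Rule-based extraction of structured facts from abstracts like the one above can start from simple patterns. A sketch that pulls the U/kg doses and group counts out of an abridged version of the Kawamura text (the regexes are hand-written for illustration and are nothing like the full Kiritchenko et al. system):

```python
import re

abstract = (
    "Thirty-nine children (22 males, 17 females) were enrolled. "
    "The high-dose group received BTX-A in the following doses: biceps 2U/kg, "
    "brachioradialis 1.5U/kg, common flexor origin 3U/kg, pronator teres 1.5 U/kg, "
    "and adductor/opponens pollicis 0.6U/kg to a maximum of 20U."
)

# Dose mentions expressed in units per kilogram (tolerates a space before the unit).
doses = [float(m) for m in re.findall(r"(\d+(?:\.\d+)?)\s*U/kg", abstract)]

# Participant counts by sex.
males, females = map(int, re.search(r"(\d+) males, (\d+) females", abstract).groups())
```

Real systems layer many such extractors (populations, interventions, outcomes, arms) and back them with machine-learned classifiers, since abstracts vary far more than one fixed pattern can cover.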


  17. Julian’s video goes here


  19. CENTRE FOR HEALTH INFORMATICS | AUSTRALIAN INSTITUTE OF HEALTH INNOVATION
  The evidence-practice disconnect
  Systematic reviews are fundamentally limited by the quality and transparency of the primary evidence on which they are based… (Registering clinical trials, 2003; doi:10.1001/jama.290.4.516)
  Solutions: (a) improve the quality and transparency of the studies that can be included in reviews, or (b) create new forms of evidence synthesis that do not rely on the current ways that clinical studies are reported.

  20. • Design bias: when clinical studies are not designed to answer the right questions at the right times. Industry statin trials used more surrogate outcomes and fewer safety outcomes, and were faster. doi:10.1038/clpt.2011.279
  • Publication bias: when clinical studies are never published, or are published after a long delay. 66% of trials had published results. doi:10.7326/0003-4819-153-3-201008030-00006
  • Reporting bias: when reports of clinical studies miss or misrepresent parts of what was measured. 40–62% of studies had ≥1 primary outcome changed, introduced, or omitted. doi:10.1371/journal.pone.0066844
  • Synthesis biases: when reviews include evidence selectively, or when results and conclusions don’t match. Systematic reviews with COIs produced more favourable conclusions. doi:10.7326/m14-0933

  21. • Sharing of patient-level data: the third movement in the push for completeness and transparency, with pressure on funders/companies, and the technologies they need. “A new future for clinical research through data sharing” (YODA Project). doi:10.1001/jama.2013.1299
  • Linking trial design to practice: making (post-approval) clinical trials match practice to properly address safety and effectiveness, and the technologies they need. “Right answers, wrong questions in clinical research.” doi:10.1126/scitranslmed.3007649
  • Bigger, better studies using EHRs: connecting research and practice to fix enrolment and make trials much more efficient, and the technologies they need. “A new architecture for connecting clinical research to patients through EHRs.” doi:10.1136/amiajnl-2014-002727

  22. • Focuses on research into peer review and research integrity
  • High visibility: permanent, unrestricted, free online access
  • Highly respected editorial board
  • Rapid and thorough peer review
  Editors-in-Chief: Stephanie Harriman (UK), Maria Kowalczuk (UK), Iveta Simera (UK), Elizabeth Wager (UK)
  www.biomedcentral.com | www.researchintegrityjournal.com
