  1. Systematic Review Essentials: What Are They, How Are They Done, and How Are They Useful?
  Evan R. Myers, MD, MPH
  Walter L. Thomas Professor, Dept of Obstetrics & Gynecology, Duke University School of Medicine
  Associate Director, Duke Evidence-based Practice Center
  November 1, 2018 | #PCORI2018

  2. Evan R. Myers, MD, MPH: Disclosures
  Relationship | Company(ies)
  • Speakers Bureau: none
  • Advisory Committee: Merck, Inc.
  • Consultancy: Merck, Inc.; AbbVie, Inc.; Bayer, Inc.; Allergan, Inc.
  • Review Panel: none
  • Board Membership: none
  • Honorarium: none
  • Ownership Interests: none
  November 29, 2018

  3. Objectives
  At the conclusion of this activity, the participant should be able to:
  • Define “systematic review”
  • Outline the steps involved in a systematic review
  • Describe potential uses of a systematic review

  4. What is a Systematic Review?

  5. Systematic vs Narrative Reviews

  6. Definitions
  • Systematic review (literature synthesis) = summary of scientific evidence about a specific question
  • Meta-analysis = technique for combining data quantitatively from multiple studies
  • Network meta-analysis = combines data from direct and indirect comparisons
    • Interventions A, B, C
    • Direct comparisons of A and B, B and C
    • Network meta-analysis estimates comparison between A and C
  • Patient-level meta-analysis = combines patient-level data from multiple studies
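To make "combining data quantitatively" concrete, here is a minimal fixed-effect (inverse-variance) pooling sketch in Python. The function name and the example effect sizes are hypothetical, not from the presentation:

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance fixed-effect pooling of study effect estimates.

    effects: per-study effect sizes (e.g., log odds ratios).
    variances: corresponding within-study variances.
    Returns (pooled effect, 95% CI lower bound, 95% CI upper bound).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled estimate
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Three hypothetical trials reporting log odds ratios with known variances
pooled, lo, hi = fixed_effect_pool([-0.3, -0.5, -0.1], [0.04, 0.09, 0.02])
```

Studies with smaller variances (larger, more precise trials) get proportionally more weight, which is why a meta-analysis can detect effects that individual underpowered studies miss.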

  7. [Diagram relating narrative reviews, systematic reviews, meta-analyses, and simulation models (decision analysis, cost-effectiveness analysis, etc.)]

  8. When Should a Systematic Review Be Done?
  • Several studies addressing the same question and using similar methods:
    • Yield different results
    • Lack the power to detect a clinically important or statistically significant result
  • Uncommon but important outcomes (e.g., complications)
  • To inform a clinical guideline or policy (including research prioritization)
  • Background work for major research projects

  9. How is a Systematic Review Done?

  10. Make Sure Someone Hasn’t Already Done One!
  Check these databases:
  • Cochrane
  • DARE (Database of Abstracts of Reviews of Effectiveness)
  • Health Technology Assessments
  • PROSPERO
  • Clinical Evidence – http://www.clinicalevidence.org

  11. Steps in a Systematic Review
  • Topic scoping: formulating the question(s)
  • Searching the evidence:
    • Sources and search strategy
    • Article selection: eligibility criteria
  • Abstracting data
  • Synthesizing data
  • Summarizing results

  12. Topic Scoping: Challenges
  • Key questions guide the entire systematic review process
  • Must be clear, precise, and relevant to stakeholders
  • Need to understand the topic before posing key questions
  • Use of stakeholders/key informants to provide context and ensure relevancy and transparency
  • PICOTS: Patients, Interventions, Comparators, Outcomes, Timing, Setting

  13. Searching the Evidence: Databases
  • MEDLINE – bibliographic and abstract coverage of biomedical literature (PubMed or Ovid)
  • CINAHL – Cumulative Index to Nursing & Allied Health (through Ovid)
  • PsycINFO – psychology and related disciplines (through Ovid)
  • Embase – for European viewpoints and drug trials (see Scopus on Duke Library Databases)
  • Cochrane Controlled Trials Registry – hand-searched (through Wiley Interscience)
  • International Pharmaceutical Abstracts – worldwide coverage of pharmaceutical science and health-related literature
  • Meta-registry of controlled trials – http://www.isrctn.com/page/mrct; NIH RePORT – https://report.nih.gov/; WHO – http://www.who.int/ictrp/en
  • Gray literature: FDA database, Google Scholar, LexisNexis

  14. Searching the Evidence: Strategy
  • Eligibility criteria:
    • PICOTS
    • Basic study design
  • Restrictions due to: sample size, country, language, publication years, completeness of information (e.g., full publication vs abstract)

  15. Searching the Evidence: Screening
  • Title/abstract review
  • Full-text review
  • 2 reviewers need to agree
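Dual-reviewer screening is commonly summarized with an inter-rater agreement statistic. Below is a minimal sketch of Cohen's kappa in Python; the function name and the toy include/exclude decisions are hypothetical illustrations, not from the presentation:

```python
def cohens_kappa(reviewer_a, reviewer_b):
    """Cohen's kappa for two reviewers' include/exclude decisions:
    observed agreement corrected for agreement expected by chance."""
    assert len(reviewer_a) == len(reviewer_b)
    n = len(reviewer_a)
    # Proportion of citations where the two reviewers agreed
    observed = sum(a == b for a, b in zip(reviewer_a, reviewer_b)) / n
    # Chance agreement from each reviewer's marginal label frequencies
    labels = set(reviewer_a) | set(reviewer_b)
    expected = sum(
        (reviewer_a.count(lab) / n) * (reviewer_b.count(lab) / n)
        for lab in labels
    )
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions on six abstracts
a = ["include", "exclude", "exclude", "include", "exclude", "exclude"]
b = ["include", "exclude", "include", "include", "exclude", "exclude"]
kappa = cohens_kappa(a, b)
```

Disagreements (kappa well below 1) are typically resolved by discussion or a third reviewer before full-text review proceeds.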

  16. Prevention of Stroke in Atrial Fibrillation: Literature Flow
  • 11,274 citations identified by literature search (PubMed: 6,860; Cochrane: 22; Embase: 4,392)
  • 2,446 duplicates removed; 15 citations identified through gray lit/manual searching
  • 8,843 citations screened; 7,321 abstracts excluded
  • 1,522 passed abstract screening; 1,300 articles excluded at full text:
    • Not a full publication, publication retracted/withdrawn, full text not obtainable, or full text not obtainable in English: 85
    • Does not meet study design or sample size requirements: 132
    • Does not meet study population requirements: 646
    • Does not meet tool/intervention or comparator requirements: 330
    • Does not include outcomes of interest: 107
  • 222 articles passed full-text screening
  • Plus 2 articles from re-screening of the 2013 report that were originally excluded for no outcomes/interventions of interest but meet the update criteria
  • 224 articles representing 122 studies* were abstracted: KQ1: 45 articles (25 studies); KQ2: 34 articles (18 studies); KQ3: 168 articles (92 studies)
  • 2013 SR: 96 articles representing 63 abstracted studies*☨
  • 2018 and 2013 merged: 320 articles, 185 abstracted studies*: KQ1: 83 articles (61 studies); KQ2: 57 articles (38 studies); KQ3: 220 articles (117 studies)
  * Some articles/studies are relevant to more than one KQ.
  ☨ 18 articles representing 9 studies provided additional outcome data that had not been included in the prior SR.
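The counts in a literature-flow diagram must reconcile at every stage. A small bookkeeping sketch using this slide's own numbers (variable names are illustrative):

```python
# Bookkeeping check of the literature-flow counts from the slide.
identified = 6860 + 22 + 4392          # PubMed + Cochrane + Embase
after_dedup = identified - 2446 + 15   # remove duplicates, add gray lit/manual
passed_abstract = after_dedup - 7321   # abstracts excluded at title/abstract stage
excluded_full_text = 85 + 132 + 646 + 330 + 107  # full-text exclusion reasons
passed_full_text = passed_abstract - excluded_full_text
abstracted = passed_full_text + 2      # re-screened articles from the 2013 report
```

Each derived total matches the figure reported on the slide (11,274 identified; 8,843 screened; 1,522 past abstract; 222 past full text; 224 abstracted), which is exactly the internal consistency a PRISMA-style flow diagram is meant to make auditable.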

  17. Searching the Evidence: Challenges
  • Desire to maximize the likelihood of capturing all of the evidence and minimize effects of reporting biases
  • Limitations by publication date and/or language
  • Search terms broad enough to capture all relevant data, yet narrow enough to minimize extraneous literature
  • Inclusion of multiple databases
  • The advantages/disadvantages of searching the gray literature
  • Desire to minimize publication bias

  18. Abstracting Data
  • Each included article abstracted for specific relevant data
  • Entered into a standard form
  • Over-read by a 2nd reader

  19. Abstracting Data: Challenges
  • Challenges in developing forms before data extraction is underway
  • Lack of uniformity among abstractors
  • Problems in data reporting:
    • Inconsistencies or missing information in published papers
    • Data reported in graphs
    • Publications with at least partially overlapping patient subgroups
  • Changes to eligibility criteria or methods made between protocol and review
  • Abstractor burden

  20. Synthesizing Data
  • Is quantitative synthesis appropriate?
  • If so, which method?
  • Rating strength of evidence:
    • Risk of bias
    • Precision
    • Applicability
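One common input to the "is quantitative synthesis appropriate?" judgment is the I² heterogeneity statistic, which estimates the share of between-study variability not explained by chance. A minimal sketch; the function name and example data are hypothetical:

```python
def i_squared(effects, variances):
    """Higgins I^2: percent of total variability across studies
    attributable to heterogeneity rather than chance (via Cochran's Q)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    # Cochran's Q: weighted squared deviations from the pooled estimate
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    if q <= 0:
        return 0.0
    return max(0.0, (q - df) / q) * 100

# Hypothetical effect sizes (e.g., log risk ratios) with equal variances
i2 = i_squared([0.1, 0.8, -0.4], [0.04, 0.04, 0.04])
```

A high I² (here the conflicting toy effects yield roughly 89%) suggests the studies may be too inconsistent for a simple pooled estimate to be meaningful, pushing the reviewer toward random-effects models, subgroup analysis, or narrative synthesis.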

  21. Reporting Findings
  • Results and applicability (particularly in the context of guidelines/policy)
  • Summary results (OR, RR, mean difference, SMD)
  • Precision (confidence interval)
  • Generalizability/feasibility
  • Range of outcomes considered
  • Trade-offs between benefits & harms, considering values
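As an illustration of one summary measure and its precision, here is an odds ratio with a log-scale (Woolf) 95% confidence interval computed from a 2×2 table. The function name and the cell counts are hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = treated events, b = treated non-events,
    c = control events, d = control non-events."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical trial: 15/100 events on treatment vs 30/100 on control
or_, lo, hi = odds_ratio_ci(15, 85, 30, 70)
```

An interval that excludes 1 (as here, with the upper bound below 1) indicates a statistically significant association; reporting the interval alongside the point estimate is what lets guideline panels judge precision.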

  22. Reporting Findings: Challenges
  • Use PRISMA to help improve quality of review reporting
  • Need for consistent messages across conclusions, discussion, and implications for practice and research
  • Should convey in a transparent manner the methods, results, and implications of findings to diverse readers
  • Should allow readers to judge the validity of the review

  23. Interpreting Findings: Challenges/Caveats
  • Combining data from diverse study designs
  • Lack of quantitative synthesis
  • Giving prominence to findings from ineligible studies, or extrapolating positive results from other reviews
  • Dealing with imprecision and inconsistency in findings:
    • Size of available studies
    • Diversity of comparisons and outcomes
  • Conflict between lack of evidence and the desire to provide guidance

  24. How Are Systematic Reviews Useful?
  • Guidelines
    • Particularly when used with a formal framework such as GRADE or USPSTF
    • Strength of evidence helps with judgment about certainty/confidence in a recommendation
    • When guidelines differ, can help with understanding the degree to which differences are due to differences in judgments about the strength of evidence vs differences in values
  • Screening

  25. How Are Systematic Reviews Useful?
  • Identifying research gaps for prioritization
    • Areas of greatest uncertainty should be highest priority for future research
    • Potential for formal quantitative methods to help identify priority areas (value of information)
    • Formal evaluation of strength of evidence can help identify whether uncertainty is due to lack of precision (we need 1 more RCT) vs bias (we need ANY RCT)
  • Example: uterine fibroids

  26. How Are Systematic Reviews Useful?
  • Methods development
    • Approaches developed to help with formal synthesis may have other applications
    • Simulation models (e.g., cervical cancer screening)

  27. Questions?
