Systematic Review Essentials: What Are They, How Are They Done, and - - PowerPoint PPT Presentation




SLIDE 1

Systematic Review Essentials: What Are They, How Are They Done, and How Are They Useful?

Evan R. Myers, MD, MPH

Walter L. Thomas Professor, Dept of Obstetrics & Gynecology, Duke University School of Medicine

Associate Director, Duke Evidence-based Practice Center

November 1, 2018

#PCORI2018

SLIDE 2

2 • November 29, 2018

Evan R. Myers, MD, MPH

Disclosures

Relationship - Company(ies):

  • Speakers Bureau - none
  • Advisory Committee - Merck, Inc.
  • Consultancy - Merck, Inc.; AbbVie, Inc.; Bayer, Inc.; Allergan, Inc.
  • Review Panel - none
  • Board Membership - none
  • Honorarium - none
  • Ownership Interests - none

SLIDE 3

Objectives

At the conclusion of this activity, the participant should be able to:

  • Define “systematic review”
  • Outline the steps involved in a systematic review
  • Describe potential uses of a systematic review
SLIDE 4

What is a Systematic Review?

SLIDE 5

Systematic vs Narrative Reviews

SLIDE 6

Definitions

  • Systematic review (literature synthesis) = summary of scientific evidence about a specific question
  • Meta-analysis = technique for combining data quantitatively from multiple studies
  • Network meta-analysis = combines data from direct and indirect comparisons
    • Interventions A, B, C
    • Direct comparisons of A and B, B and C
    • Network meta-analysis estimates comparison between A and C
  • Patient-level meta-analysis = combines patient-level data from multiple studies
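The indirect comparison described above can be sketched numerically. This is a minimal illustration of a Bucher-style indirect comparison on the log odds ratio scale; all effect estimates and standard errors below are hypothetical, for illustration only.

```python
import math

# Given direct estimates of A vs B and B vs C on the log odds ratio
# scale, the indirect A vs C estimate is their sum, and the variances
# of independent comparisons add. Numbers are hypothetical.
log_or_ab, se_ab = -0.30, 0.10   # A vs B: direct estimate, standard error
log_or_bc, se_bc = -0.20, 0.15   # B vs C: direct estimate, standard error

log_or_ac = log_or_ab + log_or_bc          # indirect A vs C estimate
se_ac = math.sqrt(se_ab**2 + se_bc**2)     # variances add

ci_low = math.exp(log_or_ac - 1.96 * se_ac)
ci_high = math.exp(log_or_ac + 1.96 * se_ac)
print(f"OR(A vs C) = {math.exp(log_or_ac):.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f})")
```

In practice, network meta-analysis combines direct and indirect evidence in one model rather than chaining two estimates, but the variance bookkeeping above is the core idea.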

SLIDE 7

[Figure: spectrum of evidence synthesis approaches - Narrative Reviews, Systematic Reviews, Meta-analyses, and Simulation Models (decision analysis, cost-effectiveness analysis, etc.)]

SLIDE 8

When Should a Systematic Review Be Done?

  • Several studies addressing the same question and using similar methods:
    • Yield different results
    • Lack the power to detect a clinically important or statistically significant result
  • Uncommon but important outcomes (e.g., complications)
  • To inform a clinical guideline or policy (including research prioritization)
  • Background work for major research projects
SLIDE 9

How is a Systematic Review Done?

SLIDE 10

Make Sure Someone Hasn’t Already Done One!

  • Check these databases:
    • Cochrane
    • DARE (Database of Abstracts of Reviews of Effects)
    • Health Technology Assessments
    • PROSPERO
    • Clinical Evidence - http://www.clinicalevidence.org
SLIDE 11

Steps in a Systematic Review

  • Topic Scoping: formulating the question(s)
  • Searching the Evidence
    • Sources and search strategy
    • Article selection: eligibility criteria
  • Abstracting Data
  • Synthesizing Data
  • Summarizing Results
SLIDE 12

Topic Scoping: Challenges

  • Key questions guide the entire systematic review process
  • Must be clear, precise, and relevant to stakeholders
  • Need to understand the topic before posing key questions
  • Use of stakeholders/key informants to provide context and ensure relevancy and transparency
  • PICOTS
    • Patients, Interventions, Comparators, Outcomes, Timing, Setting
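The PICOTS elements can be captured as a simple structured record, which keeps each key question explicit and complete. A minimal sketch; the example question below is hypothetical, loosely modeled on the atrial fibrillation topic shown later in the deck.

```python
from dataclasses import dataclass

# One field per PICOTS element; constructing an instance forces every
# element of the key question to be stated explicitly.
@dataclass
class KeyQuestion:
    patients: str
    interventions: str
    comparators: str
    outcomes: str
    timing: str
    setting: str

# Hypothetical example question, for illustration only.
kq1 = KeyQuestion(
    patients="Adults with nonvalvular atrial fibrillation",
    interventions="Direct oral anticoagulants",
    comparators="Warfarin",
    outcomes="Stroke; major bleeding",
    timing="Follow-up of at least 12 months",
    setting="Outpatient care",
)
print(kq1.comparators)  # → Warfarin
```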
SLIDE 13

Searching the Evidence: Databases

  • MEDLINE - bibliographic and abstract coverage of biomedical literature (PubMed or Ovid)
  • CINAHL - Cumulative Index to Nursing & Allied Health (through Ovid)
  • PsycINFO - psychology and related disciplines (through Ovid)
  • Embase - for European viewpoints and drug trials (see Scopus on Duke Library Databases)
  • Cochrane Controlled Trials Registry - hand searched (through Wiley Interscience)
  • International Pharmaceutical Abstracts - worldwide coverage of pharmaceutical science and health-related literature
  • Meta-Registry of controlled trials - http://www.isrctn.com/page/mrct; NIH RePORT - https://report.nih.gov/; WHO - http://www.who.int/ictrp/en
  • Grey literature: FDA database, Google Scholar, LexisNexis
SLIDE 14

Searching the Evidence: Strategy

  • Eligibility Criteria
    • PICOTS
    • Basic study design
    • Restrictions due to: sample size, country, language, publication years, completeness of information (e.g., full publication vs abstract)
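Eligibility criteria like those above can be applied as explicit filters during screening, which keeps inclusion decisions reproducible. A minimal sketch; the criteria values and records below are hypothetical.

```python
# Hypothetical eligibility criteria, mirroring the restrictions listed
# on the slide (sample size, language, publication year, full text).
criteria = {
    "min_sample_size": 50,
    "languages": {"en"},
    "earliest_year": 2000,
    "full_publication_only": True,
}

# Hypothetical candidate records from a search.
records = [
    {"id": 1, "n": 120, "lang": "en", "year": 2010, "full_text": True},
    {"id": 2, "n": 30,  "lang": "en", "year": 2015, "full_text": True},
    {"id": 3, "n": 200, "lang": "de", "year": 2005, "full_text": True},
    {"id": 4, "n": 90,  "lang": "en", "year": 1995, "full_text": False},
]

def eligible(r):
    # A record must satisfy every restriction to pass screening.
    return (r["n"] >= criteria["min_sample_size"]
            and r["lang"] in criteria["languages"]
            and r["year"] >= criteria["earliest_year"]
            and (r["full_text"] or not criteria["full_publication_only"]))

included = [r["id"] for r in records if eligible(r)]
print(included)  # → [1]
```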

SLIDE 15

Searching the Evidence: Screening

  • Title/Abstract Review
  • Full Text Review
  • Two reviewers must agree
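Dual-reviewer screening is often summarized with an agreement statistic such as Cohen's kappa, which measures agreement beyond chance. A minimal sketch; the reviewer decisions below are hypothetical.

```python
# Cohen's kappa for two reviewers screening the same records:
# (observed agreement - chance agreement) / (1 - chance agreement).
def cohens_kappa(r1, r2):
    assert len(r1) == len(r2)
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    labels = set(r1) | set(r2)
    # Chance agreement from each reviewer's marginal label frequencies.
    expected = sum((r1.count(l) / n) * (r2.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions on six abstracts.
reviewer1 = ["include", "exclude", "exclude", "include", "exclude", "exclude"]
reviewer2 = ["include", "exclude", "include", "include", "exclude", "exclude"]
print(f"kappa = {cohens_kappa(reviewer1, reviewer2):.2f}")  # → kappa = 0.67
```

Disagreements (here, the third abstract) are typically resolved by discussion or a third reviewer before full-text review.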
SLIDE 16

Prevention of Stroke in Atrial Fibrillation: Literature Flow

11,274 citations identified by literature search (PubMed: 6,860; Cochrane: 22; Embase: 4,392), plus 15 citations identified through gray literature/manual searching

2,446 duplicates removed → 8,843 citations screened

7,321 abstracts excluded → 1,522 citations passed abstract screening

1,300 articles excluded at full text:

  • Not a full publication, publication retracted/withdrawn, full text not obtainable, or full text not obtainable in English: 85
  • Does not meet study design or sample size requirements: 132
  • Does not meet study population requirements: 646
  • Does not meet tool/intervention or comparator requirements: 330
  • Does not include outcomes of interest: 107

222 articles passed full-text screening, plus 2 articles from re-screening of the 2013 report that were originally excluded for no outcomes/interventions of interest but meet the update criteria

224 articles representing 122 studies* were abstracted: KQ1: 45 articles (25 studies); KQ2: 34 articles (18 studies); KQ3: 168 articles (92 studies)

2013 SR: 96 articles representing 63 abstracted studies*☨

2018 and 2013 merged: 320 articles, 185 abstracted studies*: KQ1: 83 articles (61 studies); KQ2: 57 articles (38 studies); KQ3: 220 articles (117 studies)

* Some articles/studies are relevant to more than one KQ.
☨ 18 articles representing 9 studies provided additional outcome data that had not been included in the prior SR.
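The counts in a PRISMA-style literature flow should reconcile arithmetically from stage to stage. A quick check using the numbers from this slide:

```python
# Each stage's total should equal the previous stage minus exclusions.
identified = 6860 + 22 + 4392 + 15        # PubMed + Cochrane + Embase + gray lit
after_dedup = identified - 2446           # duplicates removed
assert after_dedup == 8843

passed_abstract = after_dedup - 7321      # abstracts excluded
assert passed_abstract == 1522

passed_fulltext = passed_abstract - 1300  # full-text exclusions
assert passed_fulltext == 222

# The itemized exclusion reasons should sum to the exclusion total.
exclusion_reasons = 85 + 132 + 646 + 330 + 107
assert exclusion_reasons == 1300

abstracted = passed_fulltext + 2          # + 2 re-screened from the 2013 report
assert abstracted == 224
print("flow counts reconcile")
```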

SLIDE 17

Searching the Evidence: Challenges

  • Desire to maximize the likelihood of capturing all of the evidence and minimize effects of reporting biases
  • Limitations by publication date and/or language
  • Search terms broad enough to capture all relevant data, yet narrow enough to minimize extraneous literature
  • Inclusion of multiple databases
  • The advantages/disadvantages of searching the gray literature
  • Desire to minimize publication bias
SLIDE 18

Abstracting Data

  • Each included article abstracted for specific relevant data
  • Entered into standard form
  • Over-read by 2nd reader
SLIDE 19

Abstracting Data--Challenges

  • Challenges in developing forms before data extraction is underway
  • Lack of uniformity among abstractors
  • Problems in data reporting
    • Inconsistencies or missing information in published papers
    • Data reported in graphs
    • Publications with at least partially overlapping patient subgroups
  • Changes to eligibility criteria or methods made between protocol and review
  • Abstractor burden
SLIDE 20

Synthesizing Data

  • Is quantitative synthesis appropriate?
  • If so, which method?
  • Rating strength of evidence
    • Risk of bias
    • Precision
    • Applicability
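One common quantitative-synthesis method is fixed-effect inverse-variance pooling of log odds ratios. A minimal sketch; the study estimates below are hypothetical, and a real synthesis would also consider random-effects models and heterogeneity statistics before choosing a method.

```python
import math

# Fixed-effect inverse-variance pooling: each study is weighted by the
# inverse of its variance, so more precise studies count for more.
# Study estimates are hypothetical, for illustration only.
studies = [  # (log OR, standard error)
    (-0.40, 0.20),
    (-0.10, 0.25),
    (-0.30, 0.15),
]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lo = math.exp(pooled - 1.96 * pooled_se)
hi = math.exp(pooled + 1.96 * pooled_se)
print(f"pooled OR = {math.exp(pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The pooled confidence interval is narrower than any single study's, which is the sense in which meta-analysis recovers power that individual studies lack.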
SLIDE 21

Reporting Findings

  • Results and Applicability (particularly in context of guidelines/policy)
    • Summary results (OR, RR, mean difference, SMD)
    • Precision (confidence interval)
    • Generalizability/feasibility
    • Range of outcomes considered
    • Trade-offs between benefits & harms, considering values
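The summary measures and confidence intervals named above can be computed directly from a 2x2 table of events. A minimal sketch with hypothetical event counts:

```python
import math

# Hypothetical 2x2 table: rows are arms, columns are event/no event.
a, b = 15, 85   # treatment arm: events, non-events
c, d = 30, 70   # control arm:   events, non-events

# Odds ratio with a 95% CI from the log-scale standard error.
or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
or_lo = math.exp(math.log(or_) - 1.96 * se_log_or)
or_hi = math.exp(math.log(or_) + 1.96 * se_log_or)

# Risk ratio: ratio of event proportions.
rr = (a / (a + b)) / (c / (c + d))

print(f"OR = {or_:.2f} (95% CI {or_lo:.2f} to {or_hi:.2f}); RR = {rr:.2f}")
```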
SLIDE 22

Reporting Findings: Challenges

  • Use PRISMA to help improve quality of review reporting
  • Need for consistent messages across conclusions, discussion, and implications for practice and research
  • Should convey in a transparent manner the methods, results, and implications of findings to diverse readers
  • Should allow readers to judge the validity of the review
SLIDE 23

Interpreting Findings: Challenges/Caveats

  • Combining data from diverse study designs
  • Lack of quantitative synthesis
  • Prominence given to findings from ineligible studies, or extrapolation of positive results from other reviews
  • Dealing with imprecision and inconsistency in findings
    • Size of available studies
    • Diversity of comparisons and outcomes
  • Conflict between lack of evidence and desire to provide guidance
SLIDE 24

How are Systematic Reviews Useful?

  • Guidelines
    • Particularly when used with a formal framework such as GRADE or USPSTF
    • Strength of evidence helps with judgment about certainty/confidence in recommendations
    • When guidelines differ, can help with understanding the degree to which differences are due to differences in judgments about the strength of evidence vs differences in values
  • Screening
SLIDE 25

How are Systematic Reviews Useful?

  • Identifying Research Gaps for Prioritization
    • Areas of greatest uncertainty should be highest priority for future research
    • Potential for formal quantitative methods to help identify priority areas (value of information)
    • Formal evaluation of strength of evidence can help identify whether uncertainty is due to lack of precision (we need 1 more RCT) vs bias (we need ANY RCT)
  • Uterine fibroids
SLIDE 26

How are Systematic Reviews Useful?

  • Methods Development
    • Approaches developed to help with formal synthesis may have other applications
    • Simulation models (e.g., cervical cancer screening)
SLIDE 27

Questions?

SLIDE 28

Thank You!

Evan R. Myers, MD, MPH

Walter L. Thomas Professor, Dept of Obstetrics & Gynecology, Duke University School of Medicine

Associate Director, Duke Evidence-based Practice Center

November 1, 2018

SLIDE 29

Learn More

  • www.pcori.org
  • info@pcori.org
  • #PCORI2018