SLIDE 1

The CAP Cancer Protocols: Using Data Standards and Minimum Data Sets to Ensure Interoperability and Ensure Quality of Cancer Diagnostic Data

Mary E. Edgerton, MD, PhD Vice-Chair, CAP Pathology Electronic Reporting Committee (PERT) UT MD Anderson Cancer Center

SLIDE 2

Data Issues in Cancer Research

  • Data is not shared
  • Data is not interoperable
  • EHR data is text based, not discretized
  • Blue Ribbon Panel analysis for the Cancer Moonshot
  • Build a national cancer data ecosystem

Create a national ecosystem for sharing and analyzing cancer data so that researchers, clinicians and patients will be able to contribute data, which will facilitate efficient data analysis.

SLIDE 3

Where do we start?

  • A cancer event timeline begins with tissue diagnosis
  • Pathologic (tissue) diagnosis contains a compilation of phenotypic features that impart information about the tumor

  • Site/organ of origin: Implies organ-based differentiation of the aberrant cell
  • Histologic type: Relates to cell of origin
  • Behavior: Malignant vs not
  • Grade: Degree of aggressiveness
  • Stage: Extent of disease at time of diagnosis (and at subsequent timepoints)
  • Biomarkers: Refinement of phenotype with molecular information
  • Genomics: Precise information about the genotype of the tumor
SLIDE 4

Patient/Disease Attributes Start with Pathology

  • College of American Pathologists (CAP) Cancer Protocols
  • Minimum data set to define a cancer diagnosis; first released in 1998
  • Evolution to synoptic report
  • Evolution to machine-readable content consisting of uniquely coded data elements paired to data values
  • Can use c-keys to identify individual concepts (data elements) paired with c-keys for individual values, such that the composite gives a unique identifier for each question|result pair, or SNOMED codes

  • Synoptic report with minimum set of required data elements now required by the American College of Surgeons (ACOS) Commission on Cancer (CoC) and by the College of American Pathologists (CAP) for accreditation

  • Not required to be in machine readable format
  • Biomarkers, genomic attributes not included in requirement
  • Biopsy reports not required to have synopsis
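The c-key pairing described above can be sketched in a few lines of code. The c-key values, element names, and the `composite_id` helper below are invented for illustration; they are not actual CAP codes.

```python
# Hypothetical c-keys for data elements (questions); values invented for illustration
ELEMENT_CKEYS = {"Histologic Type": "ckey:2118", "Tumor Grade": "ckey:2125"}
# Hypothetical c-keys for permissible values (answers)
VALUE_CKEYS = {"Invasive ductal carcinoma": "ckey:3411", "G2": "ckey:3520"}

def composite_id(element, value):
    """Unique identifier for a question|result pair, per the c-key composite idea."""
    return f"{ELEMENT_CKEYS[element]}|{VALUE_CKEYS[value]}"

print(composite_id("Histologic Type", "Invasive ductal carcinoma"))
# -> ckey:2118|ckey:3411
```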
SLIDE 5

How Is Report Data Accessed?

  • Data can be input using forms with queries and controlled values for selection, and stored as discretized data

  • eCCs are currently in XML, in transition to SDC (Structured Data Capture)
  • Depending on how the individual LIMS vendor stores the data, it can be queried and exported

  • Data can be encoded with SNOMED-CT codes
  • Natural Language Processing (NLP)
  • In general, pathology reports are semi-structured into sections (e.g., DIAGNOSIS), and there is a reporting syntax that can be used to train an NLP engine

  • Synoptic reports in question: answer format can be parsed
  • Report data is transformed via human interpretation and re-stated in the clinical note in free-text format
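As a minimal sketch of parsing a synoptic report in question: answer format, the following applies a regular expression to a made-up report fragment:

```python
import re

# Made-up synoptic report fragment in "Question: Answer" format
report = """\
Histologic Type: Invasive ductal carcinoma
Tumor Grade: G2
Margins: Uninvolved by invasive carcinoma
"""

def parse_synoptic(text):
    """Split each 'Question: Answer' line into a discrete element/value pair."""
    pairs = {}
    for line in text.splitlines():
        m = re.match(r"\s*([^:]+):\s*(.+)", line)
        if m:
            pairs[m.group(1).strip()] = m.group(2).strip()
    return pairs

print(parse_synoptic(report))
```

Real reports vary in layout, which is why the slide pairs this approach with NLP for narrative text.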

SLIDE 6

Preferred Format: eCC

  • Use of electronic based protocols
  • Can package in HL7 text message, send, and parse at consumption node
  • Evolution to SDC with full interoperability
  • Can query local database
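A rough sketch of the "package in an HL7 message, send, and parse at the consumption node" idea, using a simplified, made-up OBX fragment. This is not a complete or conformant HL7 v2 message, and the coding system label `CAPeCC` is invented for illustration.

```python
# Made-up HL7 v2 OBX fragment: OBX-3 carries the coded question,
# OBX-5 carries the result value.
message = (
    "OBX|1|ST|2118^Histologic Type^CAPeCC||Invasive ductal carcinoma\r"
    "OBX|2|ST|2125^Tumor Grade^CAPeCC||G2\r"
)

def parse_obx(msg):
    """Extract question/value pairs from OBX segments at the consumption node."""
    results = {}
    for segment in msg.strip("\r").split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":
            code, name, system = fields[3].split("^")
            results[name] = fields[5]
    return results

print(parse_obx(message))
```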
SLIDE 7

Why SDC and What Is It?

  • The SDC project was initiated by the Office of the National Coordinator for Health Information Technology (ONC) in early 2013 through its Standards and Interoperability (S&I) Framework initiative.

  • SDC’s technical workgroups have focused on defining standards by which interoperable forms are defined, rendered, populated and exchanged.

  • The SDC project was developed in cooperation with Integrating the Healthcare Enterprise (IHE), a standards organization focused on the interoperability of healthcare IT systems, which combines constrained standards into profiles for interoperable data transmission.

  • IHE gathers use-case requirements, identifies available standards, and develops technical guidelines that technical professionals can implement. IHE also hosts yearly “Connectathons” and stages “interoperability showcases” at HIMSS in which vendors assemble to demonstrate the interoperability of their products.

SLIDE 8

How Does It Work?

  • Discretized data is collected via forms with queries and (hopefully) choices for data value selections

  • SDC standardizes the definition of the data items of a Data Entry Form (DEF) inside a Form Design File (FDF).
  • An FDF is an XML description of the data items in a DEF. It is not dependent on the programming language used to create a DEF. In other words, the FDF is technology-agnostic.
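A toy illustration of a technology-agnostic FDF: the XML below imitates the idea of an SDC Form Design File but is not the actual SDC schema, and the element names (`FormDesign`, `Question`, `ListItem`) are simplified stand-ins. Any consumer that can read XML can recover the form's data items, regardless of what technology renders the DEF.

```python
import xml.etree.ElementTree as ET

# Toy FDF-like XML fragment (not the real SDC schema): the data items of a
# data entry form are described declaratively, independent of rendering code.
fdf = """\
<FormDesign id="demo-form">
  <Question id="q1" title="Histologic Type">
    <ListItem id="q1.a1" title="Invasive ductal carcinoma"/>
    <ListItem id="q1.a2" title="Invasive lobular carcinoma"/>
  </Question>
</FormDesign>
"""

root = ET.fromstring(fdf)
for q in root.iter("Question"):
    choices = [li.get("title") for li in q.iter("ListItem")]
    print(q.get("title"), "->", choices)
```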

SLIDE 9

SLIDE 10

SLIDE 11

Interoperability

  • CAP PERT Committee participates in Connectathons yearly and has demonstrated interoperability of SDC-based electronic CAP cancer protocols

SLIDE 12

Adoption and Use of eCCs

  • Cancer Care Ontario
  • California Cancer Registry
SLIDE 13

Cancer Care Ontario

  • Since 2004, Cancer Care Ontario has evolved from narrative text to synoptic reporting with use of discretized data elements

SLIDE 14

Proportion of Ontario Hospitals Reporting Cancer Pathology to Cancer Care Ontario, by Level of Standardization, from Narrative to Synoptic Format

  LEVEL 1: Narrative; no CAP content; single text field data
  LEVEL 2: Narrative; CAP content; single text field data
  LEVEL 3: Level 2 + synoptic-like structured format
  LEVEL 4: Level 3 + electronic reporting tools using drop-down menus
  LEVEL 5: Level 4 + standardized reporting language; data elements stored in discrete data fields
  LEVEL 6: Level 5 + common data and messaging standards with c-keys, SNOMED CT or other encoding

  % ONTARIO HOSPITALS    LEVEL 1  LEVEL 2  LEVEL 3  LEVEL 4  LEVEL 5  LEVEL 6
  2004-05                5%       40%      50%      5%       0%       0%
  2006-07                0%       5%       70%      25%      0%       0%
  2008-09                0%       0%       65%      17%      18%      0%
  2009-10                0%       0%       20%      2%       78%      0%
  January 2012           0%       0%       8%       0%       0%       92%
  May 2012               0%       0%       3%       0%       0%       97%
  October 2015           0%       0%       0%       0%       0%       100%

SLIDE 15

So as of now, CCO is at 100% Level 6:

  • Common data and messaging standards with c-keys, SNOMED CT or other encoding
SLIDE 16
Real-time cancer patient data

  • Allows for much more than direct patient care & surveillance, including:
  • Analytics
  • Quality improvement
  • Change management
  • Performance analysis
  • Enabling action to affect patient outcomes

SLIDE 17
Analysis of structured data can indicate need for action to improve patient care

  • Surgical Resection Positive Margins
    § Provide feedback to surgeons with higher than expected positive margin rates
  • Lymph Node Retrieval Rates
    § Increase retrieval rate by pathologists / pathology assistants
  • Frequency of cases meeting ASCO/CAP cold ischemia and fixation times
    § Feedback provided to surgery or radiology on need for pre-analytic data
  • Correlation of hormone receptor positivity with histologic type
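These feedback loops reduce to simple aggregations over discrete report fields. A minimal sketch, with made-up records, of computing a per-surgeon positive-margin rate:

```python
from collections import defaultdict

# Made-up discrete report records for illustration
records = [
    {"surgeon": "A", "margins": "positive"},
    {"surgeon": "A", "margins": "negative"},
    {"surgeon": "B", "margins": "negative"},
    {"surgeon": "B", "margins": "negative"},
]

totals = defaultdict(int)
positives = defaultdict(int)
for r in records:
    totals[r["surgeon"]] += 1
    if r["margins"] == "positive":
        positives[r["surgeon"]] += 1

# Per-surgeon positive-margin rate; higher-than-expected rates trigger feedback
rates = {s: positives[s] / totals[s] for s in totals}
print(rates)
```

The same pattern covers lymph node retrieval rates and cold ischemia/fixation time compliance, which is only practical when the source data are discrete rather than narrative.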

SLIDE 18
The US Cancer Registries could be a rich data resource if…

  • Data input was in real-time, requiring little human input
  • Data analytics were in place
  • Governance was in place to provide access for providers, consumers, and payors to PHI-free data

SLIDE 19

California Cancer Registry

  • Ongoing project to automate case creation in the registry
SLIDE 20

CCR History

  • Case Creation, Validation, Aggregation
  • Prior to 2018, operations relied upon manual abstraction
  • Data set is not considered research ready or “complete” until 18-24 months after the date of diagnosis for any case
  • Pathology (ePath) reporting is problematic in the case identification and population process
  • 8-13K reports processed manually per month to identify unreported cancer cases
  • ePath reports processed manually at the end of the data aggregation process
  • Manual intervention required to determine if a report is an actual cancer case
  • 40%-50% of reports are deleted upon initial review (not cancer)
  • Narrative text is delaying the ability of the CCR to operate as a real-time surveillance registry and provide additional value to the citizens of CA

SLIDE 21

Move to Automate Case Creation

  • Follow the lead of CCO: pathologists submit eCCs, parsed at consumption
  • For pathologists not using eCCs, use NLP to process reports
  • Provide portal for pathologists to input data
SLIDE 22

Pilot Project: St. Joseph Health (SJH)

  • Integrated Catholic health care delivery system
  • Organized into three regions
    § NorCal – Eureka, Santa Rosa, Queen of the Valley
    § SoCal – Orange, St. Jude, Mission, Laguna Beach, St. Mary’s
    § TX/NM – Covenant
  • 14 acute care hospitals, home health agencies, hospice care, outpatient services, skilled nursing facilities, community clinics, and physician organizations

  • 9 laboratories using the CAP eCC
  • 48 pathologists
SLIDE 23

Key factors for SJH participating

  • SJH already using CAP eCC through mTuitive / Meditech
  • Pathologist buy-in & champion identified
  • Worked with project management on adding this to a busy schedule
  • Executive support obtained
  • Contracts
  • Project justification development & submission
  • Mitigating costs of project

§ Funding provided by California Department of Public Health to offset costs

SLIDE 24

Project nuts and bolts

Pathologist signs out cancer report via CAP eCC in LIS → Report data saved as structured (discrete) data → Report data transformed by vendor/LIS into interoperable electronic message → Message with structured data automatically sent by SJH to CCR → Data instantly uploaded into CCR database → Data can be tracked, grouped, analyzed, and shared to improve clinical practice

SLIDE 25

Planning, testing, and implementation

  • Planning & site evaluation – 3+ months
  • Time for iterative testing - 3 months
  • Implementation - 1+ month
  • Quality Measurement
    § Pathology reports approach 100% for having all required elements
  • Automation
    § Messages including pathology report data automatically transmitted daily to the California Cancer Registry from SJH

SLIDE 26

SJH reporting – Basic statistics

  • First 4 months at first live site in Northern California
    § 16 different template types (eCCs) used
    § 193 reports received by CCR
    § Most used: Invasive carcinoma of the breast - 73 reports
    § Biomarker reports submitted
  • First month at higher population density site in LA Basin
    § 28 different template types (eCCs) used
    § 92 reports received by CCR
    § Most used: Invasive carcinoma of the breast - 20 reports
    § Biomarker reports submitted

SLIDE 27

SJH reporting practices - Ongoing

  • St. Joseph North and South (10 facilities)
  • Direct to CCR database
    § 30 - 40 reports/week from North
    § 80 - 100 reports/week from South
  • Working towards auto-population of cancer abstracting software at the SJH local cancer registry
    § Ease the burden of reporting; operational improvement at the local registry level to report and collect cancer case information

SLIDE 28

SJH practice challenges

  • Initial complaints about report length (not all sites)
  • Issues with margin section terminology
    § For one facility, surgeons used different nomenclature
    § PERT Committee working to improve data collection questions/forms
  • Labs had to adjust to not having their own interpretation of the guidelines
  • Change, change, change…

SLIDE 29

Required Electronic Submission

  • State of California passed legislation to require electronic submission of path reports to CCR
  • Does not address format, only that submission be electronic
  • Institutions not submitting electronically can be subject to charges from the state for processing their data manually

SLIDE 30

Report Structures

  • As of now, about 5-15% of reports are submitted discretized in eCC/XML format
  • About 5% are on paper (may need a portal)
  • The remainder are in narrative form, with or without a synoptic report
SLIDE 31

NLP

  • CCR contracted with Health Language Analytics (HLA) in a pilot project to develop NLP software to determine
  • Cancer vs non-cancer
  • Abstraction of Site, Behavior, Histology, and Grade
  • Pilot complete
  • Currently in use; approximately 70+% of reports are processed without manual intervention
  • Certain sites problematic
  • Head and Neck
  • GU (ovarian)
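A deliberately naive sketch of the cancer vs non-cancer triage step. The real HLA engine is far more sophisticated than this keyword check, which is shown only to illustrate the task of separating cancer cases from the 40%-50% of reports that are not cancer:

```python
# Toy triage: flag a report as a possible cancer case if malignancy terms
# appear in the diagnosis text. Term list is illustrative, not exhaustive.
MALIGNANCY_TERMS = ("carcinoma", "sarcoma", "lymphoma", "melanoma", "malignant")

def looks_like_cancer(diagnosis_text):
    """Return True if the diagnosis text mentions a malignancy term."""
    text = diagnosis_text.lower()
    return any(term in text for term in MALIGNANCY_TERMS)

print(looks_like_cancer("Invasive ductal carcinoma, grade 2"))  # True
print(looks_like_cancer("Benign fibroadenoma"))                 # False
```

A production system must also handle negation ("no evidence of carcinoma") and ambiguous sites, which is where the problematic Head and Neck and GU cases above come in.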
SLIDE 32

Long Term Vision: CCR establishing new partnerships

  • Facilitate Registry and Clinical/Research Partnerships
  • Historical processes rely upon researcher hypotheses
  • CCR plans to leverage Business Intelligence against Data Warehouse
  • CCR outputs possibly trends or anomalies, possible areas of interest to researchers
  • CCR would like to facilitate use of the CCR Data Set
  • Patients, Physicians, Reporting facility, eventually Patient access to data
  • Ability to leverage reporting facility data analytic capabilities
  • Translate data analytics into actionable items affecting patient care
  • CCR would like to share the experience across the Registry and other outcomes data collection activities (CancerLinQ, Roche-Flatiron, etc.)

SLIDE 33

CCR providing value to CA Physicians, Facility Groups, County Health

  • Providing reports/data set to physicians diagnosing or treating cancer
  • Pathologists, Oncologists, Primary Care
  • Facility Groups
  • Statistical Analysis on reported data back to Facility Groups
  • County Health
  • Statistical Analysis on reported data back to County Health
SLIDE 34

CCR providing value to CA Cancer Patients

  • Access for patients to CCR data
  • “Where do I go when I have X type of cancer?”
  • “How many times this year has X type of cancer been treated by Z institution where I am currently being seen?”
  • “Where has my X type of cancer been diagnosed and treated the most in CA in the past year?”
  • “Where has my X type of cancer in Y stage had the best survival outcomes over the last 5-10 years?”

  • Can and should be built into a Real-Time Cancer Registry Surveillance Model
SLIDE 35

CAP Providing Value

  • Advantages to using eCCs are 1) straightforward parsing without processing and 2) interoperability across EHR/pathology LIMS systems (consider digital pathology and review of outside cases)
  • CAP is looking at extending the cancer protocols to include molecular data.
  • Biomarkers have standards for data requirements via ASCO/CAP recommendations but not specifically required data elements and form structure

SLIDE 36

Q&A