Clinical Research, Public Health and Quality use of EHR Data

From IHE Wiki

Revision as of 11:55, 30 May 2007

Scope

In this paper, we are looking for several things. First, for each of our stakeholders, we would like them to describe the problems they face, and the tasks and workflows that are executed under the current state of affairs.

Secondly, we will review both the similarities and differences between the stakeholders, tasks and workflows from each of the three communities. This will help define the scope of the problems that we want to solve.

Lastly, we will put forth a proposed plan of work, not just for the PCC domain, but also for other domains, including IT Infrastructure, Laboratory, and the newly formed Quality domain.

Clinical Trials

The initial interest of the CDISC community -- the biopharmaceutical companies who sponsor clinical trials -- was to insert a research protocol into an EHR as an executable piece of workflow. While this bears resemblance to other 'case management' use cases, that phrase will not make sense to the clinical trial community.

The embedded image (also included as a Powerpoint link) shows how protocol insertion might work.

Media:Media-ProcessDiagramClinicalTrial_CaseMgt_14Mar07_lb.ppt ClinicalTrial CaseMgt 14Mar07 lb.jpg
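The idea of "protocol insertion" above can be illustrated with a minimal sketch: a protocol becomes a list of scheduled steps that the EHR surfaces on the clinician's worklist as they come due. This is not part of any IHE or CDISC specification; the ProtocolStep/Protocol structures and the study-day scheduling are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ProtocolStep:
    """One scheduled activity in a research protocol (hypothetical structure)."""
    name: str
    day: int          # study day on which the step becomes due
    done: bool = False

@dataclass
class Protocol:
    """A research protocol inserted into an EHR worklist as executable steps."""
    title: str
    steps: list

    def due_steps(self, study_day: int) -> list:
        """Steps the EHR should surface to the clinician on a given study day."""
        return [s for s in self.steps if s.day <= study_day and not s.done]

# Example: a two-visit protocol placed on the worklist
trial = Protocol("Hypothetical Phase II trial", [
    ProtocolStep("Baseline labs", day=0),
    ProtocolStep("Week 2 follow-up visit", day=14),
])
print([s.name for s in trial.due_steps(0)])   # -> ['Baseline labs']
```

The point of the sketch is only that protocol content arrives as data the EHR can execute, rather than as a paper document the clinician transcribes.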

Public Health Reporting

What is Public Health

  • Definition of Public Health (functions and services)
  • Organization of Public Health across the world (local, state and federal)

Public Health Roles and Activities

Roles

  • Direct Care providers
  • Scheduled public care (agency)
  • Health Education
  • Research organization [Tim Carney]
  • Preventive care services
  • PHR Perspectives (significance to an individual patient) [Dave McCord]
  • Public Health Informatics & Emergency Response [ Jennie Lou]
  • Patient Safety [Xu Wu]

Activities

  • Case Management – what an individual does to manage the disease
    • Disease Management – an aggregate analysis of care delivery for a disease entity
      • Chronic Disease Management (Asthma, Diabetes, COPD, Heart, Obesity)
  • Decision Support
  • Bioterrorism Preparedness/surveillance
  • Community Record (creation and maintenance)
  • Maternal and Child Health
  • School Health
  • Environmental health (abatement, asbestos)

Example Systems

HEDIS

  • HEDIS for reporting purposes only
  • HEDIS for clinical practice guidelines
  • HEDIS as annual measure – interim case management/outreach - proactive

Stakeholders

  • Safety-net clinics
  • Healthcare providers (private)
  • Local Health Department
  • State Health Department
  • Other Governmental Agencies (e.g. DOE/CDC/HHS)
  • Other healthcare providers (e.g. hospitals, labs, RX, Nursing home, home health)
  • Medicaid/Social Services case workers
  • Disease Management Departments
  • Emergency responders

Workflows and Processes

Scope

Clearly state the scope of what we are addressing in this paper.

  • data types
  • Send/assess/policy
  • Screening/biosurveillance
  • Send/feedback/clinical care intervention (closed loop process)
    • PH agent
    • Clinical care provider
    • Outbreak/emergency
    • Oversight (e.g. sickle cell – nationwide screening; group with abnormal result; then intervene)

Existing processes and technologies

State (regional) efforts

  • e.g., immunization registries [Alien Kirnak],
  • disease registries (e.g., cancer registries [Sandy Thames],
  • EMS-Trauma Systems [Chris Tilden],
  • public health laboratories,
  • vital registration [Marjorie Greenberg/Michelle Williamson] etc.
  • France?

Interstate (inter-regional) efforts

  • e.g., Great Lakes region interstate collaboration, etc.;

National efforts

  • USA: e.g., EpiInfo, NEDSS, EPHTN, Surveys [Karen Lipkind], etc.
    • Organizations: AHIC, HITSP, HISPC, CCHIT, NHIN prototypes projects, PHIN, Other….
  • (efforts from other nations: Canada?, Germany? Other?),

International Efforts

  • ??

Cooperation with Clinical Care

  • Connecting Communities for Better Health projects
  • AHRQ HIT demonstration projects
  • RWJ InformationLinks projects [Chris Tilden, Dave Ross – to be approached]
  • RWJ Common Ground projects [Chris Tilden, Dave Ross – to be approached]

Interoperability Efforts

  • NYC - Community Health Centers & Health Department
  • PHDSC Interoperability Prototype, 2005
  • IBM Laboratory Demonstration Project
  • Other????

Example (Ideal) Scenario

  • Today – paper based: Workflow for school health scenario
    • Parent requests form vs Provider fills in health form
    • Provider signs health form
    • Provider gives form to parent to bring to school
  • Future: for school health scenario
    • Parent provides RHIO consent for release restrictions to school system (BPPC)
    • Parent provides RHIO consent for release of school health record to system
    • Provider/EHR generates School Health Form including DSG
    • Provider EMR submits school health form to RHIO (DOE?) (Provide and Register)
    • School nurse retrieves health form from RHIO into local school EMR (document Consumer)
    • School nurse EMR generates walk-in visit medical summary (XDS-MS)
    • EMR sends medical summary to RHIO (Provide and Register)
    • (School attendance system generates school attendance data)
    • Local public health retrieves Medical data from RHIO school and medical system submissions (document consumer) (?Aggregate Data Retrieval)
    • Local public health analyses data for trend analysis
    • Condition detected through trending (e.g. flu) Case investigation
    • Intervention
      • Communication with parent
      • In-school services to mitigate illness (e.g. vaccination)
      • In-school Education (opportunity?)
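The future-state scenario above amounts to a sequence of document submissions and retrievals through the RHIO. A toy sketch of that flow follows; the registry is just an in-memory dictionary standing in for the RHIO infrastructure, not an implementation of the XDS transactions, and the document types are invented for illustration.

```python
# Toy document registry standing in for the RHIO (not an XDS implementation).
registry: dict = {}

def provide_and_register(doc_id: str, doc_type: str, content: str) -> None:
    """Submit a document to the shared registry (cf. 'Provide and Register')."""
    registry[doc_id] = {"type": doc_type, "content": content}

def document_consumer(doc_type: str) -> list:
    """Retrieve all documents of a given type (cf. 'Document Consumer')."""
    return [d["content"] for d in registry.values() if d["type"] == doc_type]

# Provider EHR submits the signed school health form
provide_and_register("form-1", "school-health-form", "Signed health form for pupil A")
# School nurse EMR submits a walk-in visit summary
provide_and_register("sum-1", "medical-summary", "Walk-in visit: fever, cough")
# Local public health pulls medical summaries for trend analysis
print(document_consumer("medical-summary"))   # -> ['Walk-in visit: fever, cough']
```

The closed loop in the scenario is visible even in this sketch: the same registry that receives provider submissions is queried by public health for aggregation and trend analysis.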

Problems

  1. Express the criteria: This is the definition used to determine that a patient qualifies for a particular PH program. The program has a well defined population to serve, though the population may be difficult to identify.
    • Newborn Screening – all newborns born in that state (birth)
    • Newborn Follow-up – newborn with a positive newborn screening test (lab result)
    • Immunization – all children in that state (Medicaid enrollment)
  2. Identify a patient meeting certain criteria (compositional) – patient level:
    • This is the identification of patients that qualify for a particular PH program. How this happens varies greatly within PH. Patient identification comes from referrals that include private and public clinics as well as other agencies. Incoming lists may require human review and further clarifications.
      • Newborn Screening – all birthing centers, safe haven medical facilities
      • Newborn Follow-up – public health laboratory
      • Immunization – private and public clinics, Medicaid enrollment lists
    • Also important in many PH programs is the continued evaluation of patients in the registry that continue to meet the program’s defined population requirements. Ex. Age, Address, Income, Insurance
      • Newborn Follow-up – confirmation of laboratory diagnosis age birth to adult (18) living in state
  3. Reporting data: This is the registry creation of patients that qualify for a particular PH program and interventions provided.
    • Newborn Screening – public health laboratory value, confirmation of laboratory diagnosis, ongoing treatment reports
    • Immunization – immunization given, waived, or refused
  4. Data Review/feedback: This is the processing of the reporting data for compliance, completeness, and accuracy. Compliance is evaluating that the reports expected are received in a timely manner and match to patients known to be included in the program registry. A report received for a patient not in the registry would need to be evaluated for inclusion in the program. Completeness is identifying missing or weak content that needs follow-up. Accuracy is correctness of the data if determinable.
  5. Analysis/Evaluate
  6. Mapping (harmonizing semantics and concepts)
  7. Validation (data integrity, correctness… QA)
  8. Aggregation/reporting (communicate aggregated report – probably different by topic)
  9. Communication (of raw data, of analyzed/aggregated data, of feedback/alerts, Sharing)
    1. Care coordination
    2. Population based
  10. Clarification (data reported – need clarification on meaning of the data – or additional supporting data)
  11. feedback
  12. Iterations
    1. reset
  13. Workflow
    1. Oversight of successful fulfillment of process
  14. Guidance
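Problems 1 and 2 above (expressing a program's criteria and identifying the patients who meet them) can be illustrated as a simple predicate evaluated over patient records. The patient fields, the state code, and the immunization criterion below are invented for illustration only.

```python
from datetime import date

def immunization_criteria(patient: dict, today: date) -> bool:
    """Hypothetical expressed criterion: all children in the state under 18."""
    age_years = (today - patient["birth_date"]).days // 365
    return patient["state"] == "KS" and age_years < 18

patients = [
    {"id": "p1", "birth_date": date(2001, 5, 1), "state": "KS"},
    {"id": "p2", "birth_date": date(1980, 3, 9), "state": "KS"},
]
today = date(2007, 5, 30)
# Identify the population the program serves (cohort selection)
cohort = [p["id"] for p in patients if immunization_criteria(p, today)]
print(cohort)   # -> ['p1']
```

As the problem list notes, the continued evaluation step is the same predicate re-applied over time: a patient who ages out or moves out of state drops from the registry on the next evaluation.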

Similarities to Quality and Clinical Trials

  • Identify Cohort/populations fitting a similar criteria
  • Select Cohort
  • Overlaps (e.g. PH/Quality: HCUP/HEDIS, Pharmacovigilance – quality)

Differences with Quality and Clinical Trials

  • E.g. PH Environmental subject of care (water, building…)
  • Privacy Requirements (constraints)

Goals

Collaborative Goals

This section will describe the organizational goals of this joint collaboration (what “people”-related things we want to accomplish).

  1. Engaging public health community in the development of the technical specifications for interoperable clinical and public health systems: IHE Profile Proposals for 2007-2009
    1. Identify public health domain/programs for the development of profiles – more on this in the next section
  2. Educating the community (vendors, public health, etc.) on the new profiles or profile extensions for public health.
  3. Encourage adoption of our joint technical output in the community (both public and private)
    1. Roadmap for public/private collaboration: Building health information exchanges between clinical care and public health
    2. Harmonizing state policies and regulations

Technical Goals

This section will describe the goals related to new/existing technology that needs to be created/employed

  1. Proposed timeline (or prioritization list)
    1. Identify common (public health domain independent) core technologies that need to be established as profiles (or profile extensions, in the case that a profile exists that can satisfy a particular technical goal).
    2. Identify public health domain/programs for the development of profiles
  2. Developing functional requirements and specifications (profiles) for interoperable clinical-public health systems
    1. Data content harmonization across public health systems
      1. Document and message based
    2. Data sharing workflows
      1. Document and message based
    3. Notification/Surveillance
    4. Privacy and security
    5. public health specific domain considerations
      1. ex. Immunization, Reportable Conditions, etc.
    6. etc?
  3. Survey of existing international Standards and IHE profiles – This section will contain a list of standards, existing IHE profiles, etc. that will be under consideration for use in achieving the aforementioned technical goals.
    1. Data content harmonization
      1. Document: HL7 CDA, CEN?, X-forms
      2. HL7 v2.x and v3 messaging
      3. SNOMED, LOINC, CSTE? Other terminologies?
      4. Existing IHE content profiles: XDS-MS, XDS-Lab, XDS-I, XDS-Lab for public health (in progress), etc.
      5. Work done by HITSP?
    2. Data sharing
      1. IHE Laboratory technical framework
      2. XD* - Cross Enterprise Document Sharing
      3. RFD - Retrieve Form for Data Capture (RFD)
      4. QED – query for existing data (in progress)
      5. Work done by HITSP?
    3. Notification
      1. NAV – Notification of Availability
      2. Work done by HITSP?
    4. Privacy and Security
      1. ATNA – Audit trail and node authentication
      2. XUA – Cross-Enterprise user authentication
      3. Work done by HITSP?
    5. public health specific domain considerations
      1. ex. Immunization, Reportable Conditions, etc.
      2. Work done by HITSP?
    6. Etc.


Quality Reporting

Definition of Quality (Safe, Effective, Efficient, Patient-Centered, Timely and Equitable)

Five key areas in which information technology could contribute to an improved health care delivery system:<ref>Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. 2001. p 31.</ref>

Access to the medical knowledge-base

Computer-aided decision support systems

Collection and sharing of clinical information

Reduction in errors

Enhanced patient and clinician communication

In summary, the Quality Domain includes aggregate measures of performance as well as individual case reporting of adverse events (including but not limited to Hospital Acquired Infections (HAIs), adverse drug events (ADEs), sentinel events, and others).

Quality Vision as defined by AHIC (see Appendix A)

National Quality Enterprise

Stakeholders

Consumers

Providers (Hospitals, ambulatory practices, pharmacies, labs, radiology clinics)

Clinical Practitioners (Physicians / Physician Practices /Ambulatory Practices, Laboratorian, radiologist, case managers, pharmacists, etc)

Employers

Policymakers

Accreditors

Research Community

Vendor of healthcare information systems

Performance Measure Development Organizations

Performance Measure Endorsers/ Approvers

Performance Measure Adopter (Payors, etc)

Clearinghouse/ Outsourced Measure Calculator/ Benchmarking Service

Performance Measure Implementers/ Receivers

Defining Characteristics of the Healthcare System with Respect to the Quality Enterprise

Receiving Care (includes consumer education/decision support)

Delivering Encounter-based Care (includes practitioner education/decision support)

Managing Health of Defined Populations (Cohort)

Coordination of Care (for 1 person across care venues)

Improving Quality

Measuring and Reporting Quality

Reimbursement (e.g. pay-for-performance)

Accreditation

Certification

Defining Characteristics of the National Quality Infrastructure

Metrics

EHR Products

EHR Adoption

Data Stewardship

Data Aggregation

Population Reporting and Feedback

Public Reporting

Health Information Exchange and Intermediaries

Privacy and Security, Secondary Uses of Data

Medical Education

Knowledge products

Enablers and Barriers

Enablers

  • Clear value proposition supports the use of HIT capabilities for quality assessment, quality improvement and informed decision making
  • Collaboration between providers, purchasers, consumers and accreditors/oversight bodies and professional certification entities produce uniform standards for sharing and aggregating health data and for public reporting
  • Collaboration between regional quality measurement initiatives and RHIOs or NHIN service providers
  • Standard approach for EHRs to routinely produce quality data based on approved measures that span care delivery
  • Designation of a national health data stewardship entity to oversee appropriate use of data
  • Comprehensive medical record across points of care obtained via health information exchange networks to enable intelligent alerts to providers
  • Measure developers identify data and HIT requirements in order to implement measures into clinical care and software
  • Certification of HIT based on criteria to enable reporting of an expanded set of AQA and HQA quality measurement in EHRs
  • Education of consumers on how to obtain data and assess quality of care along with sharing of data with patients' PHRs will increase consumer stake in quality measurement
  • Overall payment system that provides incentives for quality and safe care
  • Cultural change that encourages performance reporting
  • Certification of clinical decision support capabilities in EHRs
  • Additional pilot projects that provide leadership for a national framework and act as learning laboratories to link public and private data sets and assess clinical quality, cost of care and patient experience
  • Personal Health Records

Barriers

  • Lack of a clear business model for health information exchange
  • Lack of a clear business model for quality
  • Limited set of national consensus measures; robust measures not yet developed for all physician specialties
  • Lack of standards for data collection and aggregation
  • Lack of standardized mechanisms for external reporting including data stewardship
  • Lack of alignment of payment with quality performance
  • Gaps in regulations and practices relating to privacy/security and secondary use of data
  • Slow translation of research into practice at the point of care
  • Quality assessment tightly linked with site of care or individual clinicians; few integrated or episode-based metrics
  • Lack of coordination in quality measurement
  • Gaps in quality management capabilities of EHRs
  • Clinical documentation unstructured using non-standardized nomenclature
  • HIE operational in few regions
  • Poor provider economics- higher costs of doing business, declining reimbursement and the expectation of implementing information technology solutions
  • Lack of a complete medical record to support CDS (Clinical Decision Support)
  • Reluctance to share data
  • PHR adoption
  • EHR adoption

Organization of Quality (local, state and federal)[May be adequately defined in the components above]

Today

manual extract from EHR or chart review

often doesn't happen

claims reporting (e.g. PQRI (physician quality reporting initiative))

credentialing and accrediting organizations (responsible for selecting measures).

There are multiple concurrent credentialing and accrediting activities for the same practitioners. (e.g. Payor accreditation, state licensing requirements for physician groups include quality measures that could be conflicting or have multiple reporting requirements)

Measure-developing organizations work with these bodies to create the measures

Proposed Workflow

Short term – data goes to the vendor

Longer term – the EHR incorporates quality reporting directly

(insert diagram)

Quality Reporting Consumer

[May be informed by or replaced by the content in the Stakeholder section from the AHIC table as indicated above.]

Payers/ Insurers,

Regulators and policy makers (e.g. state agencies, federal regulators, etc)

Credentialing and accrediting organizations (responsible for selecting measures)

Measure-developing organizations work with these bodies to create the measures (e.g., 3M, which sells software to manage performance measures; demanders of healthcare measures, such as payers; and facilities that want to benchmark themselves on quality of care).

Provider Organization

Decision support

This section encompasses internal decision support capabilities for concurrent (or interactive) decision support, which may be managed through multiple means, e.g., order sets, protocols, care plans, aggregated alerts and reminders, and individual alerts and reminders. Management of concurrent decision support should be enabled by quality measurement and reporting elements defined externally or locally. Methodologies for attaining successful quality outcomes require careful attention to local practice and workflow, and therefore such methodologies should not be prescriptive.

Continuous Quality Improvement (CQI)

This section refers to the capability to compare performance dynamically, in near real time, and retrospectively with respect to structural, process and outcome measures of quality, whether defined and transmitted from an external stakeholder or identified locally. Methodologies for attaining quality outcomes require alignment with local workflow and practice, and therefore such methodologies should not be prescriptive. The site of aggregation of data for such CQI practices can be inherent within the EHR or external to it, either at the practice location or managed by a third-party data warehouse organization or vendor. Such data management tactics must be determined by the healthcare organization, yet enabled by the input and export capabilities of the quality domain.

External reporting of quality measures for

accountability
pay for performance
contractual

Policy issues impacting Quality measures

Medical home

Concern for specialty or primary care provided by multiple practitioners – who is accountable

Provider responsibility assignment, vs payor assignment may differ

Referred providers and practitioners (e.g. specialty providers, radiology center)

Attribution

to individual practitioner

to process

to practice/clinic/hospital

health plan

provider networks

e.g., post-surgical comorbidity/mortality scored by MD: there is a need to risk-adjust, and to consider that process factors may not be accounted for in the physician scorecard; quality measurement/reporting should be mindful of measures and reporting at the appropriate levels

Quality of measures

Inconsistency of measures

cohort definition

attribution

metric

Measure standards

different insurers requiring different measure and care delivery criteria

similar process measures used for attribution to different care providers

Types of quality measures

Structural

EMR certified for decision support

Using EMR for e-prescribing

Outcome

return to work

functional status improvement

Process

procedure done

lab order

test result improvement

Relevance to target audience

Quality considerations in a RHIO – vision statement

Electronic Data Capture from EMR

From within, as part of, the usual clinical workflow

From outside of the usual clinical workflow

Secondary Use of Shared Document Resource

Query

Publish and Subscribe

Payload Content

Message

source data
summary data

Document

source data
summary data

Filter Data

  • Numerator
    • Those in the cohort that receive the care/process expected
    • Describes case definition for the population in question
  • Denominator
    • The cohort (case definition)
    • Exclusions from cohort
    • Beware of too much complexity required up front
  • Exclusions
    • Numerator
      • Retrospective removal from denominator
      • E.g. patient compliance
      • Patient leaves practice (death, move, change doctors)
    • Denominator
      • E.g. allergy to medication class
      • Failure of therapy trial
      • Contra-indicated conditions
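The numerator/denominator/exclusion structure above can be sketched as a small measure calculation. The record fields and the exclusion reason are illustrative assumptions, not a defined reporting format.

```python
def measure_rate(patients: list) -> float:
    """Rate = numerator / denominator, with excluded patients removed first."""
    # Denominator: the cohort, minus exclusions (e.g. allergy, failed therapy trial)
    denominator = [p for p in patients if p["in_cohort"] and not p.get("excluded")]
    # Numerator: those in the denominator who received the expected care/process
    numerator = [p for p in denominator if p["received_care"]]
    return len(numerator) / len(denominator) if denominator else 0.0

patients = [
    {"in_cohort": True, "received_care": True},
    {"in_cohort": True, "received_care": False},
    {"in_cohort": True, "received_care": True, "excluded": "allergy to medication class"},
    {"in_cohort": False, "received_care": False},
]
print(measure_rate(patients))   # -> 0.5 (1 of 2 eligible patients received the care)
```

Note that the excluded patient is removed from the denominator before the rate is computed; applying exclusions only to the numerator would understate performance.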

Where in the workflow do you trigger the rule (inserting step in provider's workflow)

Variable in each implementation

Avoid too much prescription – want to show that it gets done

Could be through order

External to EHR (after the fact)

Vision – improve patient care – could be within EHR during visit, but early on will likely be post-visit

Prospective reporting/analysis before it affects the measure

Relevance to IHE (scope of activities)

PCC or Component Constructs

Payload Content
Message
Document
EMR Query
Performance Measurement Rule (XML, BPL, Gello?)

ITI

Publish/Subscribe
Query

Quality

Framework
Measures

Use Case

Example use cases

Adverse Event Reporting (Sentinel)

Hospital acquired infection
patient falls (unplanned descent to the floor)
PQRI specifies 'with injury'
ANA Nursing does not specify 'with injury'

Measures

A1C
Antiplatelet Therapy (CAD-1): Percentage of patients with CAD who were prescribed antiplatelet therapy
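The CAD-1 measure reads directly as a cohort filter plus a prescription check. A minimal sketch follows; the record structure (diagnosis and prescription sets) is a hypothetical stand-in for EHR data, not a defined extract format.

```python
def cad1_antiplatelet_rate(records: list) -> float:
    """CAD-1: percentage of patients with CAD prescribed antiplatelet therapy."""
    cad = [r for r in records if "CAD" in r["diagnoses"]]            # denominator
    on_therapy = [r for r in cad if "antiplatelet" in r["prescriptions"]]  # numerator
    return 100.0 * len(on_therapy) / len(cad) if cad else 0.0

records = [
    {"diagnoses": {"CAD"}, "prescriptions": {"antiplatelet"}},
    {"diagnoses": {"CAD"}, "prescriptions": set()},
    {"diagnoses": {"asthma"}, "prescriptions": set()},
]
print(cad1_antiplatelet_rate(records))   # -> 50.0
```

In practice the diagnosis and medication checks would resolve against coded terminologies (e.g. SNOMED, medication classes) rather than literal strings, and exclusions would be applied as in the Filter Data section above.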

Workflow for quality measurement

Insert Powerpoint flow diagram

Insert Aggregate data diagram from IHE ITI document

Include Swim Lane Diagram

Actor

Measure Rule Information Resource (Investigation Resource) (some database of the encoded rules that may be pushed to or polled by the provider system)

Payor

Accreditor

Licensor

Certifier

Provider Systems (CIS) (Information/document source) (Hospitals, ambulatory practices, pharmacies, labs, radiology clinics)

EMR

EHR

Disease Management System

Quality Management System

Risk Management System

Research Community

Measure definition

Providers

System-consumers

Analyzer

Clearinghouse/ Outsourced Measure Calculator/ Benchmarking Service Information System (receives raw from Provider System CIS)

Receivers (receives raw or aggregate data from Provider System CIS)

Performance Measure Implementers

System-Consumer/client decision system (Report Card system)

Person/individual
employer
regulator
Policy makers
licensing board
certification board
Public Health System
Payors
Accrediting Bodies (e.g. Joint Commission)

transaction

Glossary