Reporting Worklist Prioritization - Proposal

1. Proposed Workitem: Reporting Worklist Prioritization

  • Proposal Editor: Antje Schroeder, Kevin O'Donnell, Teri Sippel
  • Editor: TBA
  • Domain: Radiology

2. The Problem

When a new reading task arrives, there are already N items on the reading worklist, so the Reading Worklist Manager has to place the new task into one of N+1 possible positions on the list. All priorities are relative.
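
As a thought experiment (not profile content), the relative-priority point can be sketched in a few lines of Python. The task fields and the comparator below are hypothetical stand-ins for whatever logic a given worklist provider actually implements:

    from dataclasses import dataclass
    from functools import cmp_to_key

    @dataclass
    class ReadingTask:
        accession: str
        order_priority: int   # hypothetical: lower number = more urgent (e.g. STAT = 0)
        sla_minutes: int      # agreed turnaround time for this task

    def compare(a: ReadingTask, b: ReadingTask) -> int:
        # Hypothetical, site-customizable logic: order priority first, then SLA.
        return (a.order_priority - b.order_priority) or (a.sla_minutes - b.sla_minutes)

    def insert_task(worklist: list[ReadingTask], new_task: ReadingTask) -> list[ReadingTask]:
        # The new task can land in any of the N+1 positions; its slot is decided
        # purely by pairwise comparison against the N tasks already present.
        return sorted(worklist + [new_task], key=cmp_to_key(compare))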

While the logic (and preferences) that determine which other tasks a given task should come before is internal to the reading worklist provider (and hopefully customizable), the details about each task that are relevant to prioritization all originate outside the worklist provider, so getting access to them is an interoperability problem.

An AI Application that detects lung nodules or rules out strokes can't tell the reading worklist provider where to place a new study in the worklist, because the AI Application doesn't know what else is on the worklist. It can, however, contribute positive or negative findings that feed the logic of the worklist provider. Other systems will also need to provide (and update) relevant information, and certain details the worklist provider wants will need standardized encoding (especially if they come from multiple implementations).

Which findings are a higher priority than, say, a partial stenosis of a coronary artery, and which are lower, is something that will be encoded in the central prioritizer, not in each AI Application (granting that the prioritizer might eventually be an AI itself...).
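
As a minimal sketch of that division of labor: the AI Application contributes only a coded finding plus a confidence, and the ranking table lives in the central prioritizer. All codes and weights below are made-up placeholders, not entries from any vetted codeset:

    # Central prioritizer's ranking of coded findings.
    # Codes ("99EXAMPLE", ...) and weights are illustrative placeholders only.
    FINDING_WEIGHTS = {
        ("99EXAMPLE", "STROKE-PRESENT"): 100,
        ("99EXAMPLE", "LARGE-HEMORRHAGE"): 90,
        ("99EXAMPLE", "PARTIAL-CORONARY-STENOSIS"): 60,
        ("99EXAMPLE", "LUNG-NODULE"): 30,
    }

    def urgency(finding: tuple[str, str], confidence: float) -> float:
        # One of many possible ways to fold confidence into the ranking; the
        # low-confidence/high-risk tradeoff is exactly the kind of question
        # the key use case below raises.
        return FINDING_WEIGHTS.get(finding, 0) * confidence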

3. Key Use Case

Consider sitting down with a few chief radiologists and radiology department administrators and asking: “If you were manually prioritizing a couple dozen entries in a worklist, what would ‘optimal ordering’ mean to you? What characteristics would drive your relative priority choices? How do you handle tradeoffs between a low-confidence high-risk finding and a high-confidence moderate-risk finding?” etc.

The profile use case should spell out:

  • What pieces of information does the prioritizer need/use to prioritize worklist items (a consolidated sketch of these inputs follows this list)
      ◦ priority of the underlying imaging order
      ◦ agreed turnaround times (service level agreements)
      ◦ imaging procedure type
      ◦ patient type/location
      ◦ Reading Physician availability
      ◦ known/suspected risks to the patient
      ◦ admitting diagnosis, reason for procedure
      ◦ anatomy being imaged
      ◦ preliminary findings from one or more AI algorithms
          ▪ stroke is present, stroke is absent
          ▪ severity of hemorrhage (large/small, located in a more critical or less critical area)
          ▪ confidence of finding
      ◦ lab findings, pathology findings
      ◦ when the patient is expected to leave the facility
      ◦ whether associated information (e.g. lab results, CAD results) is available yet, and if not, when
  • What system(s) can provide each piece of information and how it is encoded, e.g.
      ◦ some from the modality (via PACS) in the image header
      ◦ some from an analysis application encoded in DICOM SR content
      ◦ some from the Order Placer in the incoming Order message
      ◦ some from a support system in a Procedure Update OMI, …
  • What codesets are used for certain concepts, e.g.
      ◦ finding codes, anatomy codes, severity codes, etc.
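
One hypothetical consolidation of the inputs listed above into a single record; the field names and types are illustrative only, not proposed profile encodings:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class PrioritizationInputs:
        order_priority: str         # from the Order Placer's incoming Order message
        sla_minutes: Optional[int]  # agreed turnaround time, if any
        procedure_code: str         # imaging procedure type
        patient_location: str       # patient type/location, e.g. ED, ICU, outpatient
        reader_available: bool      # Reading Physician availability
        reason_for_procedure: str   # admitting diagnosis / reason for procedure
        anatomy: str                # anatomy being imaged
        ai_findings: list = field(default_factory=list)  # coded findings + confidence, e.g. from DICOM SR
        labs_pending: bool = False                 # associated results not yet available
        expected_departure: Optional[str] = None   # when the patient is expected to leave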

The profile should define:

  • What system has the relevant information listed above
  • How that system encodes and conveys that information to the prioritizer
  • Whether to proxy/aggregate the information to make it more digestible for a prioritizer
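
If the proxy/aggregate option were chosen, the aggregation step might look roughly like the sketch below, which reuses the hypothetical PrioritizationInputs record above; the dict-shaped source feeds are placeholders for whatever transactions the profile actually defines:

    def aggregate(order_msg: dict, sr_findings: list, emr_record: dict) -> PrioritizationInputs:
        # Hypothetical proxy that flattens several source feeds into one
        # digestible record for the prioritizer.
        return PrioritizationInputs(
            order_priority=order_msg.get("priority", "ROUTINE"),
            sla_minutes=order_msg.get("sla_minutes"),
            procedure_code=order_msg.get("procedure_code", ""),
            patient_location=emr_record.get("location", ""),
            reader_available=True,  # would really come from a staff/scheduling system
            reason_for_procedure=order_msg.get("reason", ""),
            anatomy=order_msg.get("anatomy", ""),
            ai_findings=sr_findings,
        )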

4. Standards and Systems

Systems:

  • Reporting Worklist Manager (RIS/PACS)
  • PACS - study data source
  • AI Applications - presence/absence of various findings
      ◦ Might need adjustments to the MAP Profile to get additional details incorporated in the results
  • EMR - admission and patient record information
  • Staff/Scheduling System - expertise and availability of reading staff

5. Discussion

This work might interact with the AIR Data/Root Results proposal. The Root Results might serve as a useful summary, and if more details are needed, it might warrant extending attributes in the existing MAP Profile.

How should this workitem be packaged? As an option on SWF.b? As an extension of one of the reporting workflow profiles? As a revision to AIW (given that AI results are a high-profile piece of data, but not the only data of interest)?


  • Paste this text into a copy of your Brief Proposal
  • Move the Summary section here to the end of Section 1 in your Brief Proposal
  • Expand details in the Use Case Section of your Brief Proposal
  • Distribute material in the Discussion Section of your Brief Proposal into the other bottom sections (5,6,7,8) here.


Summary

<Summarize in a few lines the existing problem. E.g. "It is difficult to monitor radiation dose for individual patients and almost impossible to assemble and compare such statistics for a site or a population.">

<Demonstrate in a line or two that the key integration features are available in existing standards. E.g. "DICOM has an SR format for radiation dose events and a protocol for exchanging them.">

<Summarize in a few lines how the problem could be solved. E.g. "A Radiation Dose profile could require compliant radiating devices to produce such reports and could define transactions to actors that collect, analyze and present such information.">

<Summarize in a line or two market interest & available resources. E.g. "Euratom and ACR have published guidelines requiring/encouraging dose tracking. Individuals from SFR are willing to participate in Profile development.">

<Summarize in a line or two why IHE would be a good venue to solve the problem. E.g. "The main challenges are dealing with the chicken-and-egg problem and avoiding inconsistent implementations.">


5. Technical Approach

<This section describes the technical scope of the work and the proposed approach to solve the problems in the Use Cases. The Technical Committee will be responsible for the full design and may choose to take a different approach, but a sample design is a good indication of feasibility. The Technical Committee may revise/expand this section when doing the effort estimation.>

<If any context or "big picture" is needed to understand the transaction, actor and profile discussion below, that can be put here>

<If a phased approach would make sense indicate some logical phases. This may be because standards are evolving, because the problem is too big to solve at once, or because there are unknowns that won’t be resolved soon.>

<The material below also serves as the breakdown of tasks that the technical committee will use to estimate the effort required to design, review and implement the profile. It helps a lot if it is reasonably complete/realistic.>


<READ PROPOSER HOMEWORK IN Proposal Effort Evaluation FOR GUIDANCE ON POPULATING THE FOLLOWING SECTIONS>

Actors

  • (NEW) <List possible new actors>
  • <List existing actors that may be given requirements in the Profile.>

Transactions

  • (NEW) <List possible new transactions, indicating what standards would likely be used for each. Transaction diagrams are very helpful here. Feel free to go into as much detail as seems useful.>
  • <List existing transactions that may be used and which might need modification/extension.>

Profile

  • <Describe the main new profile chunks that will need to be written.>
  • <List existing profiles that may need to be modified.>

Decisions/Topics/Uncertainties

  • <List key decisions that will need to be made, open issues, design problems, topics to discuss, and other potential areas of uncertainty>
  • <Credibility point: A proposal for a profile with any degree of novelty should have items listed here. If there is nothing here, it is usually a sign that the proposal analysis and discussion has been incomplete.>

6. Support & Resources

<List groups that have expressed support for the proposal and resources that would be available to accomplish the tasks listed above.>

<Identify anyone who has indicated an interest in implementing/prototyping the Profile if it is published this cycle.>

7. Risks

<List real-world practical or political risks that could impede successfully fielding the profile.>

<Technical risks should be noted above under Uncertainties.>

8. Tech Cmte Evaluation

<The technical committee will use this area to record details of the effort estimation, etc.>

Effort Evaluation (as a % of Tech Cmte Bandwidth):

  • xx% for MUE
  • yy% for MUE + optional

Editor:

TBA

SME/Champion:

TBA <typically with a technical editor, the Subject Matter Expert will bring clinical expertise; in the (unusual) case of a clinical editor, the SME will bring technical expertise>