Reject Analysis - Proposal

1. Proposed Workitem: Reject Analysis (XRA) Profile

  • Proposal Editor: Kevin O'Donnell, Kevin Little, Ingrid Reiser
  • Editor:
  • Domain: Radiology

Summary

Radiography is one of the most variable and challenging imaging modalities. Maintaining quality depends critically on a robust process to monitor, detect, and resolve image issues.

Some sites send staff to each radiography device in the organization monthly, weekly, or even daily to review and gather reject information, which is encoded differently by each vendor.

The existing DICOM KOS (Key Object Selection) rejection note used in IOCM offers a way for all radiography devices in a hospital to export reject information in a standard format for central handling.
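
For illustration, the sketch below builds a bare-bones rejection note with pydicom. It is not a conformant implementation: the function name and UIDs are placeholders, and most modules the Key Object Selection IOD requires (patient and study attributes, the SR content tree, etc.) are omitted.

  # Illustrative sketch only (assumes pydicom); omits many attributes the full
  # Key Object Selection IOD requires (patient/study modules, SR content tree, ...).
  from pydicom.dataset import Dataset
  from pydicom.uid import generate_uid

  KOS_SOP_CLASS_UID = "1.2.840.10008.5.1.4.1.1.88.59"  # Key Object Selection Document

  def build_rejection_note(study_uid, series_uid, image_sop_class_uid, image_sop_instance_uid):
      """Flag one acquired image as 'Rejected for Quality Reasons' (DCM 113001)."""
      ds = Dataset()
      ds.SOPClassUID = KOS_SOP_CLASS_UID
      ds.SOPInstanceUID = generate_uid()
      ds.Modality = "KO"
      ds.StudyInstanceUID = study_uid          # same study as the rejected image
      ds.SeriesInstanceUID = generate_uid()    # the KOS lives in its own series

      # Document title code used by IOCM for quality rejections
      title = Dataset()
      title.CodeValue = "113001"
      title.CodingSchemeDesignator = "DCM"
      title.CodeMeaning = "Rejected for Quality Reasons"
      ds.ConceptNameCodeSequence = [title]

      # Evidence sequence pointing at the rejected instance
      ref_image = Dataset()
      ref_image.ReferencedSOPClassUID = image_sop_class_uid
      ref_image.ReferencedSOPInstanceUID = image_sop_instance_uid
      ref_series = Dataset()
      ref_series.SeriesInstanceUID = series_uid
      ref_series.ReferencedSOPSequence = [ref_image]
      ref_study = Dataset()
      ref_study.StudyInstanceUID = study_uid
      ref_study.ReferencedSeriesSequence = [ref_series]
      ds.CurrentRequestedProcedureEvidenceSequence = [ref_study]
      return ds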

A Reject Analysis profile could require compliant modalities and QA stations to produce such objects, and could define transactions and actors to collect, handle, and present that information.

AAPM Task Group 305 (TG305) characterized and analyzed this problem, and this profile proposal emerged from that activity. The TG305 report, released in May 2023, includes recommendations for this profile, and its lead authors are ready to participate in Profile development.

The problem and solution are typical of pragmatic IHE Radiology profiles like REM.

2. The Problem

The acquisition of radiographs is a technical process, and variation in many details (patient positioning, imaging technique, equipment performance) can yield non-diagnostic results.

Conscientious sites establish QA processes, of which reject analysis is a part, as an ongoing effort to maintain and improve imaging quality in the face of these challenges.

The conventional approach is for each x-ray device to keep local logs and copies of images, requiring site staff to schedule regular visits to each device. The log format, content, and level of detail are not standardized, nor is the GUI or the method for accessing the information. Some devices summarize the data internally; some don't. It is often useful to review the images themselves to understand the nature of a defect and its source, but some devices store full-fidelity copies of problem images, others store reduced-resolution or reduced-quality copies, and others store none at all.

All of this makes it an enormous challenge to operate QA as a site-level program with uniform quality standards rather than as a replicated series of device-level programs, quite apart from the inherent inefficiency of repeating the work differently at many different devices.

3. Key Use Case

Goal: Capture details of x-ray acquisitions and the results of QA steps to facilitate later analysis of “rejects” (images that were non-diagnostic or at least sub-par in some way) in order to understand their causes and improve quality.

Current workflow:

  • Modalities record some reject details in internal log files.
  • Staff periodically visit each acquisition device:
      • each acquisition model has a different GUI and method for accessing the log files
      • each vendor's log file has different content and format
      • some allow the log to be exported as an Excel, XML, or text file
      • some do their own analysis and provide a report in some format
      • some keep reduced versions of rejected images as JPEGs
  • Staff manually combine all the different logs and reports and find some way to summarize them.

Proposed workflow (similar to IHE REM):

  • When an image is flagged as a problem on the Modality, it stores a KOS Rejection Note to the PACS study.
  • Modalities store rejected DICOM images either to the PACS or to a configured alternate location.
  • If the PACS understands KOS Rejection Notes, it will sequester the rejected images from clinical use; otherwise the images will need to be sent to an alternate location.
  • Modalities store dose objects as described in the REM profile.
  • When an image is flagged as a problem later on a QA station, it stores a KOS Rejection Note to the PACS study.
  • Staff periodically visit a centralized reject analysis workstation/package:
      • the Reject Analyzer retrieves all KOS from the PACS along with the associated dose reports and images (a query sketch follows this list)
      • the Reject Analyzer prepares summary information and supports identification and resolution of general and case-specific issues
  • Standardized reject codes would facilitate easier cross-device and cross-hospital analysis.
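
To make the retrieval step concrete, here is a rough sketch of a Reject Analyzer querying a PACS for Key Object Selection ("KO") series with pynetdicom. The host, port, AE titles, and study UID are placeholders; a real Analyzer would follow up by retrieving the notes themselves, the associated dose reports, and the rejected images.

  # Hypothetical Reject Analyzer query step (assumes pynetdicom; the host, port,
  # and AE titles are placeholders).
  from pydicom.dataset import Dataset
  from pynetdicom import AE
  from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

  def find_ko_series(assoc, study_uid):
      """C-FIND the SeriesInstanceUIDs of all KO series in one study."""
      query = Dataset()
      query.QueryRetrieveLevel = "SERIES"
      query.StudyInstanceUID = study_uid   # unique key at the SERIES level
      query.Modality = "KO"                # rejection notes are KO objects
      query.SeriesInstanceUID = ""         # return key
      uids = []
      for status, identifier in assoc.send_c_find(query, StudyRootQueryRetrieveInformationModelFind):
          if status and status.Status in (0xFF00, 0xFF01) and identifier:
              uids.append(identifier.SeriesInstanceUID)
      return uids

  ae = AE(ae_title="REJECT_ANALYZER")
  ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)
  assoc = ae.associate("pacs.example.org", 104, ae_title="PACS")
  if assoc.is_established:
      print(find_ko_series(assoc, "1.2.840.99999.1.2.3"))  # placeholder study UID
      assoc.release()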

4. Standards and Systems

Systems

Modalities – primary focus on radiography, but could apply to any modality

Storage - PACS, VNA, etc.

Reject Analyzers – similar to REM Dose Information Reporters (same system?)

Registry - like REM Dose Registries

Standards

  • AAPM TG-305 Report – discussion of requirements
  • IHE REM – provides the overall model
  • IHE IOCM - provides baseline object definition and server behaviors
  • DICOM KOS (IOCM Rejection Note)
  • DICOM RDSR
  • DICOM Image IODs

5. Technical Approach

This is mostly a clone of the IHE REM profile with an additional payload of IOCM rejection notes. Depending on the situation, QA reviewers would look at detailed or summary information and might consult the rejection notes, the associated dose reports, and the rejected images themselves.
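
As a rough illustration of the summary side, the sketch below (assuming pydicom and a local folder of already-retrieved KOS files) counts how many rejected image instances the rejection notes in each study reference. A real Reject Analyzer would also fold in reject reasons, dose data, and the images themselves.

  # Minimal summary sketch (assumes pydicom and a folder of retrieved KOS files;
  # the folder name is a placeholder).
  from collections import Counter
  from pathlib import Path
  import pydicom

  def reject_counts_per_study(kos_dir):
      """Tally referenced (rejected) image instances per study across KOS files."""
      counts = Counter()
      for path in Path(kos_dir).glob("*.dcm"):
          kos = pydicom.dcmread(path)
          for study in kos.get("CurrentRequestedProcedureEvidenceSequence", []):
              n = sum(len(series.ReferencedSOPSequence)
                      for series in study.ReferencedSeriesSequence)
              counts[study.StudyInstanceUID] += n
      return counts

  # Example: print the studies with the most rejected images
  for study_uid, n in reject_counts_per_study("kos_export").most_common(10):
      print(study_uid, n)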

Actors

  • Acquisition Modality
  • Image Manager/Archive
  • (NEW) Reject Analyzer
  • Image Display?
  • (NEW?) Registry?

Transactions

Re-use with some clarification/revision

  • Store Rejection Note [RAD-66]

Re-use without meaningful revision

  • Store RDSR [RAD-63]
  • Store Image [RAD-8]
  • Query/Retrieve Key Image Note [RAD-30/31], RDSR [RAD-64/65], Images [RAD-14/16]

Profile

  • Copy/Paste/Update REM (Radiation Exposure Monitoring)
  • Document Use Case
  • Borrow some IM/IA behaviors from IOCM

Decisions/Topics/Uncertainties

  • How should we handle coarse vs. fine-grained rejection reasons? Map from fine to coarse on the Analyzer (see the mapping sketch after this list)? Require the modality to provide both?
  • Where can we get a good list of rejection reasons for modalities other than X-ray?
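
One possible approach to the first question is a simple mapping table on the Analyzer that folds fine-grained reasons into coarse categories, sketched below. All of the reason strings and categories here are invented placeholders, not codes from TG-305 or DICOM.

  # Hypothetical fine-to-coarse mapping; every value here is an invented
  # placeholder, not a code from TG-305 or DICOM.
  FINE_TO_COARSE = {
      "arm in field": "positioning",
      "clipped anatomy": "positioning",
      "patient motion": "motion",
      "overexposed": "technique",
      "underexposed": "technique",
      "detector artifact": "equipment",
  }

  def coarse_reason(fine_reason: str) -> str:
      """Map a fine-grained reject reason to its coarse category ('other' if unmapped)."""
      return FINE_TO_COARSE.get(fine_reason, "other")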

6. Support & Resources

AAPM in general, and AAPM TG305 members in particular, will be central. Kevin Little is signing on as co-editor. Paul Kinahan (RIC-AAPM Liaison) is available to support and coordinate.

Q: Can we get statements of interest to implement/prototype? Especially from Modality and Analyzer vendors.

7. Risks

  • Competing priorities for implementation/deployment bandwidth

8. Tech Cmte Evaluation

Effort Evaluation (as a % of Tech Cmte Bandwidth):

  • xx% for MUE
  • yy% for MUE + optional

Editor:

Kevin Little & Kevin O'Donnell

SME/Champion:

(both sides covered with Editors)