Reject Analysis - Proposal

1. Proposed Workitem: Reject Analysis (XRA) Profile

  • Proposal Editor:
  • Editor:
  • Domain: Radiology

2. The Problem

The acquisition of radiographs is a technical process, and variations in many details (patient positioning, imaging technique, equipment performance) can yield non-diagnostic results.

Site QA processes, of which reject analysis is a part, are an ongoing effort to maintain and improve imaging quality in the face of these challenges.

The conventional approach is for each x-ray device to keep local logs and copies of images, requiring site staff to schedule regular visits to each device. The log format, content, and level of detail are not standardized, nor is the GUI or method for accessing the information. Some devices internally summarize the data. It is often useful to review the images to understand the nature of the defect and its source. Some devices store full-fidelity copies of problem images, others store reduced-resolution or reduced-quality copies, and others store none at all.

All this makes it an enormous challenge to operate QA as a site-level program with uniform quality standards, rather than a replicated series of device-level programs, not to mention the inherent inefficiencies of repeating work differently at many different devices.


3. Key Use Case

Goal: Capture details of x-ray acquisitions and the results of QA steps to facilitate the later analysis of “rejects” (images that were non-diagnostic or at least sub-par in some way) in order to understand and improve imaging quality.

Current workflow:

  • Modalities record some reject details in internal log files.
  • Staff periodically visit each acquisition device.
  • Each acquisition device model has a different GUI and method for accessing the log files.
  • Each vendor's log file has different content and format.
  • Some devices allow the log to be exported as Excel, XML, or plain text.
  • Some perform their own analysis and provide a report in a vendor-specific format.
  • Some keep reduced-resolution versions of rejected images as JPEGs.
  • Staff manually combine all the different logs and reports, and find some way to summarize them.

Proposed workflow (similar to REM):

  • When an image is flagged as a problem on the modality, the modality stores a KOS Rejection Note to the PACS study (a sketch of such a note appears after this list).
  • Modalities are configured to store rejected DICOM images either to the PACS or to a configured alternate location.
  • If the PACS understands KOS Rejection Notes, it will sequester the rejected images from clinical use; otherwise the images will need to be sent to an alternate location.
  • Modalities store dose objects as described in the REM profile.
  • When an image is flagged as a problem on a QA station, the station stores a KOS Rejection Note to the PACS study.
  • Staff periodically visit a single reject analysis workstation/package.
  • The Reject Analyzer retrieves all KOS objects from the PACS, along with the associated dose reports and images.
  • The Reject Analyzer prepares detailed and summary views of the rejects for QA review.
  • Standardized reject codes would facilitate cross-device and cross-hospital analysis.
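
Below is a minimal sketch in Python with pydicom (not a conformant implementation) of how a modality or QA station might build such a KOS Rejection Note. The patient, study, series, and image identifiers are hypothetical placeholders; the document title code (113001, DCM, “Rejected for Quality Reasons”) is the IOCM rejection reason defined in DICOM PS3.16, alongside related reasons such as (113037, DCM, “Rejected for Patient Safety Reasons”).

  # Minimal sketch of an IOCM Rejection Note built as a DICOM Key Object
  # Selection (KOS) document. All identifiers below are hypothetical.
  from pydicom.dataset import Dataset, FileDataset, FileMetaDataset
  from pydicom.uid import ExplicitVRLittleEndian, generate_uid

  KOS_SOP_CLASS = "1.2.840.10008.5.1.4.1.1.88.59"  # Key Object Selection Document
  DX_SOP_CLASS = "1.2.840.10008.5.1.4.1.1.1.1"     # Digital X-Ray Image - For Presentation
  study_uid, series_uid, image_uid = "1.2.3.1", "1.2.3.1.2", "1.2.3.1.2.3"

  meta = FileMetaDataset()
  meta.MediaStorageSOPClassUID = KOS_SOP_CLASS
  meta.MediaStorageSOPInstanceUID = generate_uid()
  meta.TransferSyntaxUID = ExplicitVRLittleEndian

  kos = FileDataset("rejection_note.dcm", {}, file_meta=meta, preamble=b"\x00" * 128)
  kos.SOPClassUID = KOS_SOP_CLASS
  kos.SOPInstanceUID = meta.MediaStorageSOPInstanceUID
  kos.Modality = "KO"
  kos.StudyInstanceUID = study_uid        # same study as the rejected image
  kos.SeriesInstanceUID = generate_uid()  # the note lives in its own KO series

  # Document title: the IOCM rejection reason (DCM 113001)
  title = Dataset()
  title.CodeValue = "113001"
  title.CodingSchemeDesignator = "DCM"
  title.CodeMeaning = "Rejected for Quality Reasons"
  kos.ValueType = "CONTAINER"
  kos.ConceptNameCodeSequence = [title]
  kos.ContinuityOfContent = "SEPARATE"

  # Content item pointing at the rejected image
  ref = Dataset()
  ref.ReferencedSOPClassUID = DX_SOP_CLASS
  ref.ReferencedSOPInstanceUID = image_uid
  item = Dataset()
  item.RelationshipType = "CONTAINS"
  item.ValueType = "IMAGE"
  item.ReferencedSOPSequence = [ref]
  kos.ContentSequence = [item]

  # Evidence sequence so the referenced image can be located for retrieval
  series = Dataset()
  series.SeriesInstanceUID = series_uid
  series.ReferencedSOPSequence = [ref]
  study = Dataset()
  study.StudyInstanceUID = study_uid
  study.ReferencedSeriesSequence = [series]
  kos.CurrentRequestedProcedureEvidenceSequence = [study]

  kos.save_as("rejection_note.dcm")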

4. Standards and Systems

Modalities – the initial focus would be on radiography, but the profile could apply to any modality

Storage - PACS, VNA, etc.

Reject Analyzers - like REM Dose Information Reporters (same system?)

Registry - like REM Dose Registries

Standards

  • AAPM TG-305 Report – discussion of requirements
  • IHE REM – provides the overall model
  • DICOM KOS (IOCM Rejection Note)
  • DICOM RDSR
  • DICOM Image IODs

5. Discussion

This would mostly be a clone of the IHE REM profile with an added payload of IOCM rejection notes. Depending on the situation, QA reviewers would look at detailed or summary information and might consult the rejection notes, the associated dose reports, and the rejected images themselves.
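
As a sketch of that retrieval and review step, the following Python fragment (using pynetdicom; the PACS host, port, and AE titles are hypothetical) shows how a Reject Analyzer might C-FIND the PACS for series of KOS objects before fetching the notes, dose reports, and rejected images with C-MOVE or C-GET.

  # Minimal sketch: query a PACS for KO series with a Study Root C-FIND.
  # Host, port, and AE titles are hypothetical.
  from pydicom.dataset import Dataset
  from pynetdicom import AE
  from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

  ae = AE(ae_title="REJECT_ANALYZER")
  ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)

  query = Dataset()
  query.QueryRetrieveLevel = "SERIES"
  query.Modality = "KO"        # Key Object Selection documents
  query.StudyInstanceUID = ""  # empty = return this attribute for each match
  query.SeriesInstanceUID = ""

  assoc = ae.associate("pacs.example.org", 104, ae_title="PACS")
  if assoc.is_established:
      for status, identifier in assoc.send_c_find(
              query, StudyRootQueryRetrieveInformationModelFind):
          if status and status.Status in (0xFF00, 0xFF01) and identifier:
              # Each pending match is a KO series; a real analyzer would then
              # C-MOVE/C-GET the notes plus the associated RDSR dose reports
              # and any retained rejected images for review.
              print(identifier.StudyInstanceUID, identifier.SeriesInstanceUID)
      assoc.release()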

Risks