MESA/Report Reader

The test definitions on this page are RETIRED, but are kept here, for now, as an archive.


Report Reader Tests

Introduction

Report Reader systems are tested in three areas.

1. Each Report Reader will be asked to send queries (DICOM C-Find, DICOM C-Move) to a MESA Report Repository. These queries are analyzed for correctness according to the DICOM Standard.

2. Each Report Reader will be asked to query by specific attributes (e.g. Accession Number as a matching key) as defined in section 6.14 of the IHE Technical Framework: Year 3.

3. Each Report Reader will be asked to render a set of SR objects and send a copy of the display to the Technical Project Manager (email, mail, fax).

Integration Profiles and Test Procedures

This document lists a number of tests for Report Reader Systems. You may not be responsible for all of these tests. Please refer to the Connectathon web tool to list the required tests for your system. The web address of this tool depends on the year and project manager. Please contact the appropriate project manager to obtain this information.

Message Attributes

Report Readers may make queries using a number of attributes. The tests defined in this document will request queries by specific attributes listed in the table below. It is expected that the Report Reader software will contain other attributes as well; the tests require only that some attributes are present. The tests also allow you to perform multiple queries to cover all of the requested attributes. That is, we list a number of attributes in the table below and do not expect your system to use all of these attributes as matching keys in a single query.

Matching Key Attributes for Report Readers

Attribute Name        Tag
Study Date            0008 0020
Accession Number      0008 0050
Patient Name          0010 0010
Patient ID            0010 0020
Modalities in Study   0008 0061
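
For illustration only, the sketch below shows one way to assemble a Study Root C-FIND identifier carrying these matching keys. It uses the pydicom library, which is not part of the MESA tools, and the wildcard value is hypothetical; your Report Reader builds its queries through its own interface.

    # Illustrative only: a Study-level query identifier built with pydicom (not part of MESA)
    from pydicom.dataset import Dataset

    identifier = Dataset()
    identifier.QueryRetrieveLevel = "STUDY"   # Study Root model, Study level
    identifier.PatientName = "CRTHREE*"       # hypothetical wildcard matching value
    identifier.PatientID = ""                 # empty attributes are return keys only
    identifier.StudyDate = ""
    identifier.AccessionNumber = ""
    identifier.ModalitiesInStudy = ""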

Message Values

The individual tests described below require specific values in the matching keys. These values are defined in tables with those tests.

Configuration

The MESA Image Manager and MESA Report Repository each maintain a database of DICOM applications used for C-Move operations. Add an entry for the storage SCP associated with your workstation. Edit the text file $MESA_TARGET/db/loaddicomapp.pgsql (Unix) or $MESA_TARGET/db/loaddicomapp.sql (Windows NT). Use the existing entries as a template and add entries for your workstations as appropriate. The column names found in the SQL insert statements are described in the following table.

Column Name   Description
aet           DICOM Application Entity Title. Must be unique.
host          Host name (or IP address) of the application.
port          TCP/IP port number for receiving associations.
org           The organization that operates the device. Useful if multiple organizations use the Image Manager.
com           A comment field.

You can test your work as follows:

    perl load_apps.pl imgmgr
    perl load_apps.pl rpt_repos

The files $MESA_TARGET/runtime/rpt_repos/ds_dcm.cfg and $MESA_TARGET/runtime/imgmgr/ds_dcm.cfg are used to configure the MESA Report Repository and MESA Image Manager (separately). The only parameter users should change is the LOG_LEVEL value. Log levels are defined in Starting the MESA Servers. DICOM configuration parameters are listed in the table below.

Application              AE Title         Port
MESA Report Repository   REPORT_ARCHIVE   2800
MESA Image Manager       MESA_IMG_MGR     2350
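
Before starting a test, you may want to confirm that the servers are listening on these ports. The sketch below sends a DICOM C-ECHO (Verification) request to the MESA Report Repository using the pynetdicom library; pynetdicom is not part of the MESA distribution, and the host name and calling AE title are assumptions for the machine running the servers.

    # Illustrative connectivity check against the MESA Report Repository (pynetdicom assumed)
    from pynetdicom import AE

    ae = AE(ae_title="TEST_READER")                  # hypothetical calling AE title
    ae.add_requested_context("1.2.840.10008.1.1")    # Verification SOP Class
    assoc = ae.associate("localhost", 2800, ae_title="REPORT_ARCHIVE")
    if assoc.is_established:
        status = assoc.send_c_echo()
        print("C-ECHO status: 0x%04X" % status.Status)
        assoc.release()
    else:
        print("Could not associate with REPORT_ARCHIVE")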

Read the Runtime Notes section of the Installation Guide to determine the proper settings for the MESA runtime environment.

Starting the MESA Servers

These instructions assume you are using a terminal emulator on Unix systems or an MS DOS command window under Windows NT. Each test uses a command line interface; there is no graphical user interface. Before you start the test procedure, you need to start the MESA Report Repository and MESA Image Manager servers. Make sure the appropriate database is running (PostgreSQL, SQL Server). To start the MESA servers:

1. Enter the Report Reader exam folder: mesa_tests/rad/actors/rpt_reader.

2. Execute the appropriate script to start the servers:

     scripts/start_mesa_servers.csh  (Unix)
     scripts\start_mesa_servers.bat  (Windows)

Log levels are set for the MESA Image Manager in the file: $MESA_TARGET/runtime/imgmgr/ds_dcm.cfg. Log levels are:

0. no logging

1. errors

2. warnings

3. verbose

4. conversational (really verbose)

When you are finished running one or more tests, you can stop the servers:

    scripts/stop_mesa_servers.csh  (Unix)
    scripts\stop_mesa_servers.bat  (Windows)

Log files are stored in $MESA_TARGET/logs.

For the security tests, the MESA servers are started with different scripts. These are scripts/start_mesa_secure.csh and scripts\start_mesa_secure.bat. The log levels are the same as for the standard tests. The MESA servers are stopped using these scripts: scripts/stop_mesa_secure.csh and scripts\stop_mesa_secure.bat.

Loading Test Data

The Report Reader tests use a common set of Image and SR objects. Report Readers that are not image displays may ignore the Image objects. You may load these objects into the MESA servers one time before any of the tests are started. After you start the MESA servers as described in Starting the MESA Servers, load the test data:

    perl 90x/load_90x.pl

This script loads images into the MESA Image Manager and SR objects into the MESA Report Repository.

Individual Tests

Test 600: SINR - Sample DICOM Reports

In this test, SINR Report Readers will examine the sample reports provided by Report Creators. The goal of the test is to make sure the Report Reader actors are not surprised by content when they arrive at a Connectathon.

References

Instructions

  1. Find reports uploaded by other vendors for test 600 in Gazelle under Connectathon -> List of samples. This page will evolve as vendors add samples, so be patient.
  2. Retrieve the files created by the other vendors. Examine/import/render them so that you are confident your software understands the content.
  3. You will find a table listing the systems that have submitted samples for test 600. Extract this table and place it in a spreadsheet that can be read by Excel. Create columns labelled Reviewed (Y/N) and Comments.
  4. When you are finished, upload the spreadsheet as the results for this test.
  5. If you find issues with the samples, send an email to the Connectathon Manager now to wake him or her up. You can also contact the sample provider directly to resolve issues.
  6. The goal is no surprises.

Evaluation

The evaluation of this test is performed by examining the spreadsheet you provided to make sure you made a good faith effort to review the sample reports.

Supplemental Information

Report Reader Test 601: Simple Image Report

References

Instructions

1. Create/modify the SQL script to identify the Report Reader under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Load the data sets into the MESA Report Repository as described in Loading Test Data above.

4. Retrieve the SR for patient CRTHREE^PAUL. This report will have 0 image references. Render the report and send a copy of the rendered report to the Project Manager.
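
Your Report Reader normally performs this retrieval through its own user interface. Purely as an illustration of the underlying exchange, the sketch below queries the MESA Report Repository for CRTHREE^PAUL and asks it to C-MOVE the matching study to a storage SCP; it uses pynetdicom (not part of MESA), and the host name, calling AE title, and move destination AE title are assumptions that must match the entries from the Configuration section.

    # Illustrative query/retrieve of the CRTHREE^PAUL report (pynetdicom assumed)
    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import (
        StudyRootQueryRetrieveInformationModelFind,
        StudyRootQueryRetrieveInformationModelMove,
    )

    identifier = Dataset()
    identifier.QueryRetrieveLevel = "STUDY"
    identifier.PatientName = "CRTHREE^PAUL"
    identifier.StudyInstanceUID = ""

    ae = AE(ae_title="TEST_READER")                 # hypothetical calling AE title
    ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)
    ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)
    assoc = ae.associate("localhost", 2800, ae_title="REPORT_ARCHIVE")
    if assoc.is_established:
        # Collect the matching study UIDs first, then retrieve them
        study_uids = []
        for status, rsp in assoc.send_c_find(identifier, StudyRootQueryRetrieveInformationModelFind):
            if status and status.Status in (0xFF00, 0xFF01) and rsp is not None:
                study_uids.append(rsp.StudyInstanceUID)
        for uid in study_uids:
            move_id = Dataset()
            move_id.QueryRetrieveLevel = "STUDY"
            move_id.StudyInstanceUID = uid
            # "MY_STORE_SCP" must match the storage SCP you registered in loaddicomapp
            for mstatus, _ in assoc.send_c_move(move_id, "MY_STORE_SCP",
                                                StudyRootQueryRetrieveInformationModelMove):
                pass
        assoc.release()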

Evaluation

Supplemental Information

Report Reader Test 602: Simple Image Report with One Reference

References

Instructions

1. Create/modify the SQL script to identify the Report Reader under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Load the data sets into the MESA Report Repository as described in Loading Test Data above.

4. Retrieve the SR for patient CTFIVE^JIM. This report will have 1 image reference to an image stored on the MESA Image Manager. You may retrieve and display that image if your application supports that feature. If you do not display images, you need to at least indicate there is a reference to an image (a sketch for locating the reference follows these steps).

5. Render the report and send a copy of the rendered report to the Project Manager.
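
If your Report Reader does not display images, one way to locate the image reference mentioned in step 4 is to walk the SR content tree and collect IMAGE content items. The sketch below does this with pydicom (not part of MESA) on a report that has already been retrieved to disk; the file name is hypothetical.

    # Illustrative: list the image references carried in a retrieved SR object (pydicom assumed)
    from pydicom import dcmread

    def image_references(item):
        # Recursively yield SOP Class/Instance UID pairs from IMAGE content items
        if item.get("ValueType") == "IMAGE":
            for ref in item.get("ReferencedSOPSequence", []):
                yield ref.ReferencedSOPClassUID, ref.ReferencedSOPInstanceUID
        for child in item.get("ContentSequence", []):
            yield from image_references(child)

    sr = dcmread("ctfive_jim_sr.dcm")              # hypothetical file name
    for sop_class, sop_instance in image_references(sr):
        print("Referenced image:", sop_class, sop_instance)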

Evaluation

Supplemental Information

Report Reader Test 603: Simple Image Report with Two References

References

Instructions

1. Create/modify the SQL script to identify the Report Reader under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Load the data sets into the MESA Report Repository as described in Loading Test Data above.

4. Retrieve the SR for patient MRTHREE^STEVE. This report will have references to two series stored on the MESA Image Manager. You may retrieve and display those images if your application supports that feature. If you do not display images, you need to at least indicate there is a reference to a series with images.

5. Render the report and send a copy of the rendered report to the Project Manager.

Evaluation

Supplemental Information

Report Reader Test 611: Numeric Report

References

Instructions

1. Create/modify the SQL script to identify the Report Reader under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Load the data sets into the MESA Report Repository as described in Loading Test Data above.

4. Retrieve the SR for patient CRTEN^GEORGE. This report has a measurement of the length of the patient’s left leg (a sketch for reading the numeric content follows these steps).

5. Render the report and send a copy of the rendered report to the Project Manager.
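
The measurement mentioned in step 4 is carried as a NUM content item in the SR. As an illustration only (pydicom assumed, hypothetical file name), the sketch below prints every numeric measurement in a retrieved report, which should include the leg-length value and its units.

    # Illustrative: print the numeric measurements in a retrieved SR object (pydicom assumed)
    from pydicom import dcmread

    def numeric_items(item):
        # Recursively yield NUM content items from the SR content tree
        if item.get("ValueType") == "NUM":
            yield item
        for child in item.get("ContentSequence", []):
            yield from numeric_items(child)

    sr = dcmread("crten_george_sr.dcm")            # hypothetical file name
    for num in numeric_items(sr):
        name = num.ConceptNameCodeSequence[0].CodeMeaning
        measured = num.MeasuredValueSequence[0]
        units = measured.MeasurementUnitsCodeSequence[0].CodeValue
        print(name, measured.NumericValue, units)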

Evaluation

Supplemental Information

Report Reader Test 901: SR SCU Query Keys

In this test, the Report Reader is required to query the MESA Report Repository using specific matching keys. For each matching key and value in the table below, direct the Report Reader to make one or more queries of the MESA Report Repository. Repeated or multiple queries are allowed. That is, you might choose to query several times with a certain matching key. We do not expect the Report Reader to send individual queries with multiple matching keys (Patient Name and Patient ID), but the test software will allow that.

Attribute Name        Tag         Matching Key Value
Study Date            0008 0020   19950126
Accession Number      0008 0050   2001B20
Patient Name          0010 0010   CRTHREE*
Patient ID            0010 0020   CR3
Modalities in Study   0008 0061   MR

References

Instructions

1. Create/modify the SQL script to identify the Report Reader under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Load the data sets into the MESA Report Repository as described in Loading Test Data above.

4. Send at least one DICOM Study Level C-Find request (Study Root model) to the MESA Report Repository for each attribute/matching key value defined in the table above.
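
Your Report Reader issues these queries through its own interface; the sketch below is only an illustration of the kind of Study Root C-FIND the test software looks for, using one value from the table. It relies on pynetdicom (not part of MESA); the host name and calling AE title are assumptions, and the calling AE title should be the one you pass to the evaluation script.

    # Illustrative Study Root C-FIND for one matching key from the table (pynetdicom assumed)
    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

    identifier = Dataset()
    identifier.QueryRetrieveLevel = "STUDY"
    identifier.AccessionNumber = "2001B20"      # value from the table; repeat for each matching key
    identifier.PatientName = ""                 # empty attributes are return keys
    identifier.StudyInstanceUID = ""

    ae = AE(ae_title="TEST_READER")             # hypothetical; use your Report Reader's AE title
    ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)
    assoc = ae.associate("localhost", 2800, ae_title="REPORT_ARCHIVE")
    if assoc.is_established:
        for status, rsp in assoc.send_c_find(identifier, StudyRootQueryRetrieveInformationModelFind):
            if status and status.Status in (0xFF00, 0xFF01) and rsp is not None:
                print(rsp.StudyInstanceUID)
        assoc.release()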

Evaluation

Run the evaluation script to verify that each attribute was requested in a query.

    perl 901/eval_901.pl <AE Title of Report Reader>

Supplemental Information

Results will be found in the file 901/grade_901.txt. If you need to clear the existing queries to run the test again, you can restart at step 3 or run this script:

    perl scripts/clear_queries.pl

Report Reader Test 902: SCU Query Evaluation

This test uses the queries sent by the Report Reader during Test 901 and any other queries you want to evaluate. This test examines all queries sent by the Report Reader to determine if they are legal DICOM queries. After you conclude Test 901, the MESA Report Repository will still have a record of the queries sent by your Report Reader. If you want to send more queries to the Report Repository, you may do so. There are no required queries. You might want to send queries at the Series and SOP Instance level.

References

Instructions

Evaluation

Evaluate the Report Reader queries as follows:

    perl 902/eval_902.pl <AE Title of Report Reader>

Supplemental Information

Query results are stored in the file 902/grade_902.txt. As above, you can clear the queries stored by the MESA Report Repository as follows:

    perl scripts/clear_rpt_repos_queries.pl

Basic Security Tests

This section describes tests that are specific to the IHE Basic Security integration profile. If you have the MESA servers running for the “standard” tests, you should stop those servers now. You will need to start the MESA secure servers with a different script.

Report Reader Test 1511: Simple Imaging Report

Report Reader Test 1511 uses the same structure as test 601. The Report Reader is expected to communicate with other systems using TLS negotiation and to send appropriate audit messages to the MESA syslog server. The table below lists the Audit Messages that should be generated by your Report Reader. Please refer to the document IHE Tests: Transaction Sequences for the full context of these messages. You might trigger other messages to the Audit Record Repository based on your interaction with your Report Reader.

Identifier   Description            Source          Destination
1511.022     DICOM-instances-used   Report Reader   Audit Record Repository
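
As an illustration of the TLS requirement above (not a required step), the sketch below opens a TLS-protected verification association using pynetdicom, which is not part of MESA. The certificate file names, the secure port, and the secure AE title are assumptions; use the values configured for your secure node and for the secure MESA servers.

    # Illustrative TLS association request (pynetdicom assumed; certificate paths,
    # secure port, and AE titles are hypothetical)
    import ssl
    from pynetdicom import AE

    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    context.load_verify_locations("mesa_ca.pem")               # CA that signed the MESA certificate
    context.load_cert_chain("reader_cert.pem", "reader_key.pem")
    context.check_hostname = False                             # test certificates may not match the host name

    ae = AE(ae_title="TEST_READER")                            # hypothetical calling AE title
    ae.add_requested_context("1.2.840.10008.1.1")              # Verification SOP Class
    assoc = ae.associate("localhost", 2861,                    # secure port is an assumption
                         ae_title="REPORT_ARCHIVE",
                         tls_args=(context, None))
    if assoc.is_established:
        print("TLS association established")
        assoc.release()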

References

Instructions

1. Create/modify the SQL script to identify the Report Reader under test. This may differ from the corresponding step in test 601, since it identifies your secure node.

2. Start the secure MESA servers as instructed in Starting the MESA Servers.

3. Load the data sets into the MESA Report Repository:

    perl 90x/load_90x_secure.pl

4. Clear the MESA Audit Record Repository:

    perl scripts/clear_audit.pl

5. Retrieve the SR for patient CRTHREE^PAUL. This report will have 0 image references. Render the report and send a copy of the rendered report to the Project Manager.

Evaluation

1. Evaluate the Audit Records produced by your system:

    perl 1511/eval_1511.pl

Supplemental Information

Grab all of the files (tar/zip) in $MESA_TARGET/logs/syslog and send these to the Project Manager.

Test Cases: PDI

These test cases are generally associated with the Radiology PDI (Portable Data for Imaging) profile.

Test Case 1931: Media “Reader” Read RSNA 2005 CD

The purpose of this test is for the Display actor to open the DICOMDIR file on the RSNA 2005 CD and render the images and other composite objects on the CD.

References

The sample CDs are found at http://ihedoc.wustl.edu/mesasoftware/10.15.0/dist/ext_data/pdi_2005/index.htm


IHE Radiology Technical Framework (RAD TF)

Instructions

To run this test, follow these steps:

1. Obtain the RSNA 2005 PDI Demonstration CD. If you do not have a physical copy of the CD, download the ISO image of the CD from the MESA distribution page and create a CD from the ISO image.

2. Use your DICOM application to open the DICOMDIR file on the RSNA CD (a sketch for listing the DICOMDIR contents follows these steps).

3. Select and display all of the studies on the CD. Look for studies with DICOM SR objects.

4. Create a text file that lists each vendor that supplied data. Add the following comment for each study: Yes, could render DICOM SR; No, could not render SR; NA, no SR present. If there are DICOM SR objects that have problems, submit screen captures or other documentation that illustrates the issue.
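
As a complement to step 2, the sketch below shows one way to enumerate what the DICOMDIR on the CD records, using pydicom (not part of MESA); the CD mount point is hypothetical.

    # Illustrative: list the patients, studies and SR documents recorded in a DICOMDIR
    # (pydicom assumed; the CD mount point is hypothetical)
    from pydicom import dcmread

    dicomdir = dcmread("/media/cdrom/DICOMDIR")
    for record in dicomdir.DirectoryRecordSequence:
        rtype = record.DirectoryRecordType
        if rtype == "PATIENT":
            print("Patient:", record.PatientName)
        elif rtype == "STUDY":
            print("  Study:", record.get("StudyDescription", ""))
        elif rtype == "SR DOCUMENT":
            print("    SR document:", record.get("ReferencedFileID", ""))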

Evaluation

1. Submit the text file to the Kudu/Gazelle tool. If there are problems with any SR objects, follow up with email to the Project Manager.

Supplemental Information

Test Case 1932: Media “Reader” Reads Vendor CDs

The purpose of this test is for the Display actor to open the DICOMDIR file on CDs provided by vendors for the RSNA 2005 PDI demonstration and to render the composite objects on those CDs.

References

The sample CDs are found at http://ihedoc.wustl.edu/mesasoftware/10.15.0/dist/ext_data/pdi_2005/index.htm

Instructions

To run this test, follow these steps:

1. Obtain the vendor CDs from the RSNA 2005 PDI Demonstration. If you do not have a physical copy of the CDs, download the ISO images from the MESA distribution page and create CDs from the ISO images.

2. Use your DICOM application to open the DICOMDIR file on each vendor CD.

3. Select and display all of the studies on the CD. Look for DICOM SR objects.

4. Create a text file that lists each vendor CD. For each vendor add the following comment: Yes, could render DICOM SR; No, DICOM SR present but with errors; NA, no DICOM SR present. If there are problems with any DICOM SR, submit screen captures or other evidence to illustrate the problem.

Evaluation

1. Submit the text file to the Kudu/Gazelle tool. If there are problems with any CDs, follow up with email to the Project Manager.

Supplemental Information

Test Cases for Cardiology

Test 20526: Example DICOM Reports

In this test, Report Readers will examine the sample reports provided by Report Managers that support the DICOM Storage Option in DRPT. The goal of the test is to make sure the Report Reader actors are not surprised by content when they arrive at a Connectathon.

References

Instructions

  1. On this wiki, find the page that lists Test Software/Data for your Connectathon. Do not use data from a different Connectathon unless you are explicitly given that instruction by the Connectathon Manager.
  2. On the Test Software/Data page, you should see a link to Shared Data and test 20526. That will give you access to a page that lists sample images and reports uploaded by other vendors.
  3. Retrieve the zip files created by the other vendors. Examine/import/render them so that you are confident your software understands the content (a sketch for extracting the PDF payload from an Encapsulated PDF report follows these steps).
  4. You will find a table called Results Table on the samples page for test 20526. Extract this table and place it in a spreadsheet that can be read by Excel. Fill in the columns labelled Reviewed and Comments. The Comments field is mainly an indicator of issues with the images. If there are no issues, you can leave that blank.
  5. When you are finished, upload the spreadsheet into the Kudu/Gazelle tool.
  6. If you find issues with the DICOM Encapsulated PDF object, send an email to the Connectathon Manager now to wake him or her up. You can also contact the Report Manager representative directly to resolve issues.
  7. The goal is no surprises.
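
For the DICOM Encapsulated PDF objects mentioned above, the PDF itself is carried in the Encapsulated Document attribute (0042,0011). As an illustration only (pydicom assumed, hypothetical file names), the sketch below writes that payload out so it can be opened in a PDF viewer.

    # Illustrative: extract the PDF payload from a DICOM Encapsulated PDF report
    # (pydicom assumed; file names are hypothetical)
    from pydicom import dcmread

    ds = dcmread("sample_report.dcm")
    with open("sample_report.pdf", "wb") as out:
        out.write(ds.EncapsulatedDocument)     # (0042,0011) holds the PDF bytes
    print("Wrote PDF for patient", ds.PatientName)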

Evaluation

The evaluation of this test is performed by examining the spreadsheet you provided to make sure you made a good faith effort to review the sample images.

Supplemental Information


Test Cases for Eye Care

Test 50217: Displayable Reports -- Example Report

In this test, Report Readers will examine the sample reports provided by other participants. The goal of the test is to make sure the Report Reader actors are not surprised by content when they arrive at a Connectathon.

References

Instructions

  1. On this wiki, find the page that lists Test Software/Data for your Connectathon. Do not use data from a different Connectathon unless you are explicitly given that instruction by the Connectathon Manager.
  2. On the Test Software/Data page, you should see a link to Shared Data and test 50217. That will give you access to a page that lists sample images uploaded by other vendors.
  3. Retrieve the zip files created by the other vendors. Examine/import/render them so that you are confident your software understands the content.
  4. You will find a table called Results Table on the samples page for test 50217. Extract this table and place it in a spreadsheet that can be read by Excel. Fill in the columns labelled Reviewed and Comments. The Comments field is mainly an indicator of issues with the images. If there are no issues, you can leave that blank.
  5. When you are finished, upload the spreadsheet into the Kudu/Gazelle tool.
  6. If you find issues with the images, send an email to the Connectathon Manager now to wake him or her up. You can also contact the image provider directly to resolve issues.
  7. The goal is no surprises.

Evaluation

The evaluation of this test is performed by examining the spreadsheet you provided to make sure you made a good faith effort to review the sample reports.

Supplemental Information