MESA/Report Repository

From IHE Wiki
The test definitions on this page are RETIRED, but are kept here, for now, as an archive.


Report Repository Tests

The MESA tools are installed on a computer referred to as the MESA test system. The tools are written under the assumption that this computer is distinct from your Report Repository (the system under test). If you choose to run the MESA tools on the same system as your Report Repository, you are likely to have conflicts in port assignments. You will need to use different port numbers for your application because the MESA applications use hard-wired port numbers.


Introduction

Report Repositories are tested on responses to C-Find requests and C-Move requests. Each test is run using the same procedure. We assume you are using an interactive terminal or terminal emulator and are logged on to the MESA test system. Change directory to $MESA_TARGET/mesa_tests/rad/actors/rpt_repos.

Integration Profiles and Test Procedures

This document lists a number of tests for Report Repository Systems. You may not be responsible for all of these tests. Please refer to the Connectathon web tool to list the required tests for your system. The web address of this tool depends on the year and project manager. Please contact the appropriate project manager to obtain this information.


Message Attributes

These tests will query the Report Repository for attributes using the DICOM C-Find command. The table below lists the attributes used in the queries.

Attribute Name Tag
Study Level
Study Date 0008 0020
Study Time 0008 0030
Accession Number 0008 0050
Patient Name 0010 0010
Patient ID 0010 0020
Study ID 0020 0010
Study Instance UID 0020 000D
Modalities in Study 0008 0061
Referring Physician’s Name 0008 0090
Patient’s Birth Date 0010 0030
Patient’s Sex 0010 0040
Number of Study Related Series 0020 1206
Number of Study Related Instances 0020 1208
Series Level
Modality 0008 0060
Series Number 0020 0011
Series Instance UID 0020 000E
Number of Series Related Instances 0020 1209
Request Attribute Sequence 0040 0275
>Requested Procedure ID 0040 1001
>Scheduled Procedure Step ID 0040 0009
Performed Procedure Step Start Date 0040 0244
Performed Procedure Step Start Time 0040 0245
Composite Object Instance Level
Image Number 0020 0013
SOP Instance UID 0008 0018
SOP Class UID 0008 0016
SR Specific Attributes
Completion Flag 0040 A491
Verification Flag 0040 A493
Content Date 0008 0023
Content Time 0008 0033
Observation Date Time 0040 A032
Verifying Observer Sequence 0040 A073
> Verifying Organization 0040 A027
> Verification Date Time 0040 A030
> Verifying Observer Name 0040 A075
> Verifying Observer Identification Code Sequence 0040 A088
Referenced Request Sequence 0040 A370
> Study Instance UID 0020 000D
> Accession Number 0008 0050
> Requested Procedure ID 0040 1001
> Requested Procedure Code Sequence 0032 1064
>> Code Value 0008 0100
>> Coding Scheme Designator 0008 0102
>> Code Meaning 0008 0104
Concept Name Code Sequence 0040 A043
> Code Value 0008 0100
> Coding Scheme Designator 0008 0102
> Coding Scheme Version 0008 0103
> Code Meaning 0008 0104
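The (group, element) tags in the table above can be represented programmatically when building or checking queries. A minimal Python sketch (an illustration only, not part of the MESA tools; attribute names and tags are taken from the table):

```python
# Map a few study-level attribute names from the table above
# to their DICOM (group, element) tags.
STUDY_LEVEL_TAGS = {
    "Study Date": (0x0008, 0x0020),
    "Study Time": (0x0008, 0x0030),
    "Accession Number": (0x0008, 0x0050),
    "Patient Name": (0x0010, 0x0010),
    "Patient ID": (0x0010, 0x0020),
    "Study Instance UID": (0x0020, 0x000D),
    "Modalities in Study": (0x0008, 0x0061),
}

def format_tag(tag):
    """Render a (group, element) pair in the usual DICOM notation."""
    group, element = tag
    return f"({group:04X},{element:04X})"

print(format_tag(STUDY_LEVEL_TAGS["Study Instance UID"]))  # (0020,000D)
```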

Message Values

The expected values in C-Find responses and SR objects retrieved from the Report Repository are taken directly from the SR instances loaded into the Report Repository. We do not further document these. If your responses do not match expected values, the evaluation scripts will print both your response and the expected value.


Configuration

The Report Repository scripts described below use an ASCII configuration file to identify parameters such as host names and port numbers. The configuration file is named rptrepos_test.cfg and is included in the directory $MESA_TARGET/mesa_tests/rad/actors/rpt_repos. Edit the file and change entries (host name, port number, AE title) which pertain to your system. Your system is identified by entries that begin with TEST. The file $MESA_TARGET/runtime/rpt_repos/ds_dcm.cfg is used to configure the MESA Report Repository. A workstation server is used as a Report Reader. The configuration file for this server is found in $MESA_TARGET/runtime/wkstation/ds_dcm.cfg. The only parameter users should change is the LOG_LEVEL value. Log levels are defined in Starting the MESA Servers. The applications listed in the table below are MESA applications that execute on the MESA test system. Configure your system to recognize these servers; you will be asked to move (C-Move) composite objects to the WORKSTATION1 application.
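The exact syntax of rptrepos_test.cfg is not reproduced here; as an illustration only, a parser for a hypothetical "KEY value" line format might look like the following (verify the real format against your installed copy of the file):

```python
# Illustrative only: parse a simple "KEY value" ASCII configuration file.
# The real rptrepos_test.cfg syntax may differ -- check the file itself.
def parse_config(text):
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition(" ")
        entries[key] = value.strip()
    return entries

# Hypothetical entries; your system is identified by keys beginning with TEST
sample = """
TEST_HOST myhost.example.org
TEST_PORT 104
TEST_AE_TITLE MY_RPT_REPOS
"""
cfg = parse_config(sample)
print(cfg["TEST_AE_TITLE"])
```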


Application AE Title Host Port
MESA Report Repository REPORT_ARCHIVE MESA test system 2800
MESA Report Reader WORKSTATION1 MESA test system 3001
MESA Audit Record Repository MESA test system 4000

Read the Runtime Notes section of the Installation Guide to determine the proper settings for the MESA runtime environment.


Starting the MESA Servers

These instructions assume you are using a terminal emulator on Unix systems or an MS DOS command window under Windows NT. Each test uses a command line interface; there is no graphical user interface. Before you start the test procedure, you need to start the MESA Report Repository and MESA Report Reader. Make sure the appropriate database is running (PostgreSQL, SQL Server). To start the MESA servers:

1. Enter the Report Repository exam folder: mesa_tests/rad/actors/rpt_repos

2. Execute the appropriate script to start the servers:

    scripts/start_mesa_servers.csh  (Unix)
    scripts\start_mesa_servers.bat  (Windows)

When you are finished running one or more tests, you can stop the servers:

    scripts/stop_mesa_servers.csh  (Unix)
    scripts\stop_mesa_servers.bat  (Windows)

Log files are stored in $MESA_TARGET/logs. For the security tests, the MESA servers are started with different scripts. These are scripts/start_mesa_secure.csh and scripts\start_mesa_secure.bat. The log levels are the same as for the standard tests. The MESA servers are stopped using these scripts: scripts/stop_mesa_secure.csh and scripts\stop_mesa_secure.bat.


Loading Test Data

All tests will use a common set of test data. These should be loaded one time before any tests are run. You need to start the MESA servers first as described in Starting the MESA Servers. The MESA Report Repository receives a copy of the test data. You should clear your Report Repository of all data before loading the test data. Use this command to load the test SR objects into your Report Repository:

    perl  80x/load_80x.pl

Individual Tests

Test 600: SINR - Sample DICOM Reports

In this test, SINR Report Repositories will examine the sample reports provided by Report Creators. The goal of the test is to make sure the Report Repository actors are not surprised by content when they arrive at a Connectathon.

References

Instructions

  1. Find reports uploaded by other vendors for test 600 in gazelle under Connectathon -> List of samples. This page will evolve as vendors add samples, so be patient.
  2. Retrieve the files created by the other vendors. Examine/import/render them so that you are confident your software understands the content.
  3. You will find a table listing the systems which have submitted samples for test 600. Extract this table and place it in a spreadsheet that can be read by Excel. Create columns labelled Reviewed (Y/N) and Comments.
  4. When you are finished, upload the spreadsheet as the results for this test.
  5. If you find issues with the samples, send an email to the Connectathon Manager now to wake him or her up. You can also contact the sample provider directly to resolve issues.
  6. The goal is no surprises.

Evaluation

The evaluation of this test is performed by examining the spreadsheet you provided to make sure you made a good faith effort to review the sample reports.

Supplemental Information

Report Repository Test 801: Modalities In Study

The DICOM attribute 0008 0061 is named “Modalities in Study”. The 801 series tests will send four Study Level queries to the Report Repository and query on this attribute. The table below lists the four queries (a-d) and the values used for the matching key. For the a query, the attribute 0008 0061 is sent as zero length, so this becomes a return key.

Query Value for Attribute 00080061
a “”
b “SR”
c “CT”
d “SR\CT”
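The matching behavior behind queries a-c can be sketched in a few lines. This is a simplified model of the C-Find matching semantics for this attribute, not MESA code: a zero-length key is a universal match, and the backslash-separated multi-value key in query d is the illegal case the text mentions.

```python
def matches_modalities_in_study(query_value, study_modalities):
    """Simplified C-Find matching for Modalities in Study (0008,0061).

    query_value      -- matching-key value from the query ("" = universal match)
    study_modalities -- modalities present in the study, e.g. {"SR", "CT"}
    """
    if query_value == "":
        # Zero length: universal match; the attribute becomes a return key.
        return True
    if "\\" in query_value:
        # Query d ("SR\CT") falls here: an illegal multi-valued matching key.
        raise ValueError("multi-valued matching key is illegal for this attribute")
    return query_value in study_modalities

study = {"SR", "CT"}
print(matches_modalities_in_study("", study))    # True  (query a)
print(matches_modalities_in_study("SR", study))  # True  (query b)
print(matches_modalities_in_study("CT", study))  # True  (query c)
```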


References

Instructions

The d query is actually an illegal query and is included for your own testing. We do not evaluate the results of your response.

1. Start the MESA servers as described in Starting the MESA Servers. If you have not already loaded the test data into your system, do so as described in Loading Test Data.

2. Run the script which sends four queries to your Report Repository. This script sends the same queries to the MESA Report Repository so that we can compare results.

   perl 801/801_rptrepos.pl

Evaluation

1. Evaluate the response of your system to queries a, b, and c:

   perl 801/eval_801.pl

Supplemental Information

Report Repository Test 802: Related Series/SOP Instances

These tests query Report Repositories for DICOM attributes Number of Study Related Series, Number of Study Related SOP Instances, and Number of Series Related SOP Instances.
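These three counts are derived from the instances a repository holds. A minimal sketch of the computation (hypothetical UIDs for illustration; not the MESA evaluation logic):

```python
from collections import defaultdict

def study_counts(instances):
    """Given (study_uid, series_uid, sop_instance_uid) triples, compute
    Number of Study Related Series (0020,1206) and
    Number of Study Related Instances (0020,1208) for each study."""
    series_by_study = defaultdict(set)
    instances_by_study = defaultdict(int)
    for study_uid, series_uid, _sop_uid in instances:
        series_by_study[study_uid].add(series_uid)
        instances_by_study[study_uid] += 1
    return {uid: (len(series_by_study[uid]), instances_by_study[uid])
            for uid in series_by_study}

# Hypothetical UIDs: one study with two series, three instances total
data = [
    ("1.2.3", "1.2.3.1", "1.2.3.1.1"),
    ("1.2.3", "1.2.3.1", "1.2.3.1.2"),
    ("1.2.3", "1.2.3.2", "1.2.3.2.1"),
]
print(study_counts(data)["1.2.3"])  # (2, 3)
```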

References

Instructions

1. Start the MESA servers as described in Starting the MESA Servers. If you have not already loaded the test data into your system, do so as described in Loading Test Data.

2. Run the script which sends two queries to your Report Repository. This script sends the same queries to the MESA Report Repository so that we can compare results.

   perl 802/802_rptrepos.pl

Evaluation

1. Evaluate the response of your system to queries 802a and 802b:

   perl 802/eval_802.pl

Supplemental Information


Report Repository Test 803: Other Report Repository Attributes

This test is not defined for IHE Year 4.


Report Repository Test 804: Report Specific Queries

This test queries for attributes at the SOP Instance level and compares the results to values stored in the MESA system. The table below lists the attributes requested in the queries.


0040 A491 Completion Flag
0040 A493 Verification Flag
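Both attributes take values from small enumerated sets defined by DICOM (PARTIAL/COMPLETE for the Completion Flag, UNVERIFIED/VERIFIED for the Verification Flag). A quick validity check, sketched in Python as an illustration:

```python
COMPLETION_FLAG_VALUES = {"PARTIAL", "COMPLETE"}       # (0040,A491)
VERIFICATION_FLAG_VALUES = {"UNVERIFIED", "VERIFIED"}  # (0040,A493)

def check_sr_flags(completion, verification):
    """Return a list of problems with an SR object's flag values."""
    problems = []
    if completion not in COMPLETION_FLAG_VALUES:
        problems.append(f"bad Completion Flag: {completion!r}")
    if verification not in VERIFICATION_FLAG_VALUES:
        problems.append(f"bad Verification Flag: {verification!r}")
    return problems

print(check_sr_flags("COMPLETE", "VERIFIED"))  # []
```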


References

Instructions

1. Start the MESA servers as described in Starting the MESA Servers. If you have not already loaded the test data into your system, do so as described in Loading Test Data.

2. Run the script to retrieve a report from the Report Repository:

    perl 804/804_rptrepos.pl

Evaluation

1. Evaluate the SR object retrieved from the Report Repository:

    perl 804/eval_804.pl

Supplemental Information


Report Repository Test 811: SR Retrieve

These tests retrieve SR objects from the Report Repository and compare them to the original data. The MESA tools will issue a C-Move request with a Destination AE Title of WORKSTATION1. That refers to a server on the MESA system listening for DICOM Associations at port 3001.
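A C-Move request names the destination only by AE title; the move SCP must resolve that title to a host and port from its own configuration, which is why your system needs the WORKSTATION1 entry from the Configuration section. A sketch of that lookup (the host name is a placeholder, not a real MESA host):

```python
# The C-Move destination arrives only as an AE title; the Report
# Repository resolves it against its configured peer list.
KNOWN_AE_TITLES = {
    "WORKSTATION1": ("mesa-test-system", 3001),    # MESA Report Reader
    "REPORT_ARCHIVE": ("mesa-test-system", 2800),  # MESA Report Repository
}

def resolve_move_destination(ae_title):
    try:
        return KNOWN_AE_TITLES[ae_title]
    except KeyError:
        # In DICOM terms this corresponds to failing the C-Move with
        # status 0xA801 (Refused: Move Destination unknown).
        raise ValueError(f"unknown move destination: {ae_title}")

print(resolve_move_destination("WORKSTATION1"))  # ('mesa-test-system', 3001)
```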

References

Instructions

1. Start the MESA servers as described in Starting the MESA Servers. If you have not already loaded the test data into your system, do so as described in Loading Test Data.

2. Run the script to retrieve a report from the Report Repository:

   perl 811/811_rptrepos.pl

Evaluation

1. Evaluate the SR object retrieved from the Report Repository:

    perl 811/eval_811.pl

Supplemental Information

Basic Security Tests

This section describes tests that are specific to the IHE Basic Security integration profile. If you have the MESA servers running for the “standard” tests, you should stop those servers now. You will need to start the MESA secure servers with a different script.

Report Repository Test 1511: Simple Imaging Report

Report Repository test 1511 uses a combination of events from other tests under the Basic Security integration profile. The Report Repository is expected to communicate with other systems using TLS negotiation and to send appropriate audit messages to the MESA syslog server. The table below lists the Audit Messages that should be generated by your Report Repository. Please refer to the document IHE Tests: Transaction Sequences for the full context of these messages. You might trigger other messages to the Audit Record Repository based on your interaction with your Report Repository.


Identifier Description Source Destination
1511.016 DICOMQuery Report Repository Audit Record Repos
1511.020 BeginStoringInstances Report Repository Audit Record Repos
1511.022 InstancesSent Report Repository Audit Record Repos
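Checking that your Report Repository produced the required audit messages amounts to a set-coverage test. A simple sketch (message names taken from the table above; not the MESA evaluation code):

```python
# Audit message names required of the Report Repository in test 1511
REQUIRED_AUDIT_EVENTS = {"DICOMQuery", "BeginStoringInstances", "InstancesSent"}

def missing_audit_events(received):
    """Return required audit event names not seen among received messages."""
    return REQUIRED_AUDIT_EVENTS - set(received)

received = ["DICOMQuery", "InstancesSent"]
print(sorted(missing_audit_events(received)))  # ['BeginStoringInstances']
```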

References

Instructions

1. Start the secure MESA servers as described in Starting the MESA Servers. Load the test data into your system as follows:

    perl 15xx/load_15xx.pl

2. Run the script to send queries to the Report Repository in the secure mode:

    perl 1511/1511_rptrepos.pl

Evaluation

1. Evaluate the SR object retrieved from the Report Repository:

   perl 1511/eval_1511.pl

Supplemental Information

1. Grab all of the files (tar/zip) in $MESA_TARGET/logs/syslog and send these to the Project Manager.


Test Cases for Cardiology

Test 20526: Example DICOM Reports

In this test, Report Repositories which support the DICOM Storage Option will examine the sample reports provided by Report Managers which support this option in DRPT. The goal of the test is to make sure the Report Repository actors are not surprised by content when they arrive at a Connectathon.

References

Instructions

  1. Find reports uploaded by other vendors for test 20526 in kudu under MESA Tests -> Pre-Connectathon...Objects. This page will evolve as vendors add samples, so be patient.
  2. Retrieve the files created by the other vendors. Examine/import/render them so that you are confident your software understands the content.
  3. You will find a table listing the systems which have submitted samples for test 20526. Extract this table and place it in a spreadsheet that can be read by Excel. Create columns labelled Reviewed (Y/N) and Comments.
  4. When you are finished, upload the spreadsheet as the results for this test.
  5. If you find issues with the samples, send an email to the Connectathon Manager now to wake him or her up. You can also contact the sample provider directly to resolve issues.
  6. The goal is no surprises.

Evaluation

The evaluation of this test is performed by examining the spreadsheet you provided to make sure you made a good faith effort to review the sample reports.

Supplemental Information

Test Cases for Eye Care

Test 50217: ECDR Example Report

In this test, Report Repositories will examine the sample reports provided by other participants. The goal of the test is to make sure the Report Repository actors are not surprised by content when they arrive at a Connectathon.

References

Instructions

  1. On this wiki, find the page that lists Test Software/Data for your Connectathon. Do not use data from a different connectathon unless you are explicitly given that instruction from the Connectathon Manager.
  2. On the Test Software/Data page, you should see a link to Shared Data and test 50217. That will give you access to a page that lists sample images uploaded by other vendors.
  3. Retrieve the zip files created by the other vendors. Examine/import/render them so that you are confident your software understands the content.
  4. You will find a table called Results Table on the samples for test 50217. Extract this table and place it in a spreadsheet that can be read by Excel. Fill in the columns labelled Reviewed and Comments. The Comments field is mainly an indicator of issues with the images. If there are no issues, you can leave that blank.
  5. When you are finished, upload the spreadsheet into the Kudu / Gazelle tool.
  6. If you find issues with the images, send an email to the Connectathon Manager now to wake him or her up. You can also contact the image provider directly to resolve issues.
  7. The goal is no surprises.

Evaluation

The evaluation of this test is performed by examining the spreadsheet you provided to make sure you made a good faith effort to review the sample reports.

Supplemental Information