
MESA/Evidence Creator


Evidence Creator Tests

Introduction

This document describes several tests for Evidence Creator systems. The Display Consistency tests are defined in a separate document: Display Consistency Test Plan for Image Creator.

Integration Profiles and Test Procedures

This document lists a number of tests for Evidence Creator systems. You may not be responsible for all of these tests. Please refer to the Connectathon web tool to list the required tests for your system. The web address of this tool depends on the year and project manager. Please contact the appropriate project manager to obtain this information.

Message Attributes

Message Values

Configuration

The MESA Image Manager maintains a database of DICOM applications used for C-Move operations. Add an entry for the storage SCP(s) associated with your workstation: edit the text file $MESA_TARGET/db/loaddicomapp.pgsql (Unix) or $MESA_TARGET/db/loaddicomapp.sql (Windows NT), using the existing entries as a template, and add entries for your workstations as appropriate. The column names found in the SQL insert statements are described in the following table.

Column Name Description
aet DICOM Application Entity Title. Must be unique.
host Host name (or IP address) of the application.
port TCP/IP port number for receiving associations.
org The organization that operates the device. Useful if multiple organizations use the Image Manager.
com A comment field.
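
For illustration only, an added entry might look like the sketch below. The table name and exact statement form are assumptions; copy the pattern of the existing INSERT statements in loaddicomapp.pgsql / loaddicomapp.sql rather than this sketch, and substitute your own AE title, host, port, organization, and comment.

   -- hypothetical entry for one workstation storage SCP (table name may differ in the actual file)
   INSERT INTO dicomapplication (aet, host, port, org, com)
      VALUES ('MY_WKSTN_SCP', '192.168.1.50', 4006, 'MyCompany', 'Workstation storage SCP');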


You can test your work as follows:

   perl load_apps.pl imgmgr

The file $MESA_TARGET/runtime/imgmgr/ds_dcm.cfg is used to configure the MESA Image Manager. The only parameter users should change is the LOG_LEVEL value. Log levels are defined in the Starting the MESA Servers section below. DICOM configuration parameters are listed in the table below.

Application AE Title Port
MESA Image Manager MESA_IMG_MGR 2350


Read the Runtime Notes section of the Installation Guide to determine the proper settings for the MESA runtime environment.


Starting the MESA Servers

These instructions assume you are using a terminal emulator on Unix systems or an MS-DOS command window under Windows 2000 or XP. Each test uses a command line interface; there is no graphical user interface. Before you start the test procedure, you need to start the MESA Image Manager servers. Make sure the appropriate database is running (PostgreSQL, SQL Server). To start the MESA servers:

1. Enter the Image Creator or Evidence Creator test folder: $MESA_TARGET/mesa_tests/rad/actors/imgcrt or $MESA_TARGET/mesa_tests/rad/actors/evdcrt

2. Execute the appropriate script to start the servers:

   scripts/start_mesa_servers.csh  (Unix)
   scripts\start_mesa_servers.bat  (Windows)

Log levels are set for the MESA Image Manager in the file: $MESA_TARGET/runtime/imgmgr/ds_dcm.cfg. Log levels are:

1. no logging

2. errors

3. warnings

4. verbose

5. conversational (really verbose)

When you are finished running one or more tests, you can stop the servers:

   scripts/stop_mesa_servers.csh  (Unix)
   scripts\stop_mesa_servers.bat  (Windows)

Log files are stored in $MESA_TARGET/logs.

Starting Servers for Test 1701

The Test 1701 scripts are found in a different directory from the other Evidence Creator scripts: they are in the evdcrt (evidence creator) directory and use a slightly different server control procedure. Before you start the test procedure, you need to start the MESA Image Manager servers. Make sure the appropriate database is running (PostgreSQL, SQL Server). To start the MESA servers:

1. Enter the Evidence Creator folder: $MESA_TARGET/mesa_tests/rad/actors/evdcrt

2. Execute the appropriate script to start the servers:

   perl scripts/mesa_servers.pl start 

Log levels are as described in the section above. When you are finished running one or more tests, you can stop the servers:

    perl  scripts/mesa_servers.pl stop

Log files are stored in $MESA_TARGET/logs.


Submission of Results

Test descriptions below inform the reader to “submit results to the Project Manager”. This does not mean “email”. The current submission process should be documented by the Project Manager, but it will not include emailing files directly to the Project Manager.


Individual Tests

Test 251: Storage Commitment Association Negotiation

This is a test of association negotiation with your Evidence Creator. An Image Manager that wants to send Storage Commitment N-Event reports will initiate a DICOM association with your Evidence Creator and should propose to be an SCP of the Storage Commitment SOP Class (Push Model).

In this test, one association will be proposed. The MESA Image Manager proposes the SCP role. Your Evidence Creator should accept this association and proposed presentation context.

References

Instructions

1. Edit the entries in [mesa]\mesa_tests\rad\actors\evdcrt\evdcrt_test.cfg to reflect your setup. You will need to change the three TEST_STORAGE_COMMIT parameters to reflect your AE, host, and port.

2. Start your server process that accepts Image Manager Storage Commitment association requests.

3. Run the evaluation script for this test. This script sends the appropriate association requests and records results in 251/grade_251.txt:

    perl 251/eval_251.pl [-v]

Evaluation

1. Submit the file 251/grade_251.txt generated above into gazelle as the result for this test. This file should indicate 0 errors for success.

Supplemental Information

Test 281: Example Images and other DICOM objects

Test 281 is used to collect sample images, Key Object Notes, Evidence Documents and/or other DICOM composite objects produced by an Evidence Creator. The intent of the test is to send DICOM composite objects (DICOM Part 10 format) to the Project Manager for redistribution to other participants. This will allow them time to test/examine your data before an in-person meeting.

This test differs from normal tests in that you submit samples to the Project Manager so other participants can review those samples. Because of this, please submit your samples 2 WEEKS before the normal deadlines. This will give the other systems a chance to review your data.

References

Instructions

1. Produce DICOM Part 10 files for the types of Evidence Documents your system produces. If you produce more than one type of Evidence Document, produce at least one DICOM Part 10 file for each document type. (A quick way to sanity-check the files is shown after these steps.)

2. Assuming you have this ability, render the images/objects you produced as a reference. The goal is for an Image Display actor to examine your rendered images and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format and place it in the appropriate subfolder.

3. Upload DICOM files and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and Add a sample under DICOM_OBJECTS.

4. On the 'Samples to share' tab, find your uploaded file and perform validation on your sample. You may choose one or more of the DICOM validator tools available there.


5. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

6. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.
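
If you have a DICOM toolkit such as DCMTK installed (it is not part of the MESA tools), you can sanity-check a Part 10 file before uploading it, for example:

   dcmdump my_evidence_document.dcm

dcmdump prints the File Meta Information and data set, so you can confirm the SOP Class UID and verify that no real patient demographics are present. The filename above is a placeholder.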


Evaluation

Evaluation of this test occurs when other participants review your data. In the event that other participants find errors/issues, you may be asked to modify your data.

Test 282: Example GSPS Objects

The goal of this “test” is to provide samples for other vendors to display. You should send a “representative sample” of the data produced by your system.

GSPS objects are discussed/supported in the Consistent Presentation of Images integration profile. Evidence Creator actors create GSPS objects (requirement) and may optionally produce images. Evidence Creator actors that support the CPI integration profile should submit the GSPS objects they produce and any images produced. If you use images that are not part of the MESA test set as the basis for your GSPS objects, you should submit those, even if you did not produce them. That will allow other actors to display the original images and the appropriate GSPS objects.

Each system should send samples of the Image and/or GSPS objects that they create to the Image Manager.

These are to be submitted two weeks in advance of the general date for test results to allow other vendors the opportunity to test with them.

References

Instructions

Either create DICOM Part 10 files and submit them (see steps 5-6) or follow the instructions below.

1. Start the MESA servers as described in Starting MESA Servers.

2. Clear the MESA Image Manager (if necessary). From $MESA_TARGET/mesa_tests/rad/actors/evdcrt:

   perl scripts/clear_img_mgr.pl

3. Send sample images/GSPS objects to the MESA Image Manager.

4. Find the files in $MESA_STORAGE/imgmgr/instances

5. Upload images/GSPS objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and Add a GSPS 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

6. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

7. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.


Evaluation

The evaluation of this test comes in the form of feedback from other users of your data. If other users identify issues with your data, you will be asked to work with those users (and Project Manager) to resolve those issues.

Supplemental Information

Test 285: DICOM SOP Classes

In this test, you list the DICOM SOP classes and transfer syntaxes that are supported by your system. The goal of this test is to publish this list of SOP classes well in advance of Connectathon events so that other systems who might process your objects will not be surprised.

Instructions

  1. Create a text file named <gazelle_system_name>_285.txt. In the text file, enter your organization name, system name and date. List the DICOM SOP Classes and Transfer Syntaxes your Evidence Creator supports. If you have a DICOM Conformance Statement which states this information, you may submit that instead. (An illustrative example appears after these steps.)
  2. Upload the text file into the Gazelle tool to signal that you have completed this task.
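
For illustration only, a hypothetical <gazelle_system_name>_285.txt might look like the sketch below. The SOP Class and Transfer Syntax UIDs shown are standard DICOM UIDs, but the list must of course reflect what your own Evidence Creator actually supports; the organization and system names are placeholders.

   Acme Imaging -- ACME_EVIDENCE_CREATOR -- 2019-03-11

   Supported SOP Classes:
     Key Object Selection Document Storage   1.2.840.10008.5.1.4.1.1.88.59
     Comprehensive SR Storage                1.2.840.10008.5.1.4.1.1.88.33
     Encapsulated PDF Storage                1.2.840.10008.5.1.4.1.1.104.1

   Supported Transfer Syntaxes:
     Implicit VR Little Endian               1.2.840.10008.1.2
     Explicit VR Little Endian               1.2.840.10008.1.2.1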

Evaluation

There is no formal evaluation for this test. The purpose is to alert other participants of the SOP Classes & Transfer Syntaxes to expect.

Test 301: Evidence Creator Storage and Commitment

In test 301, the Evidence Creator C-Stores one or more Evidence Documents to the MESA Image Manager. The Evidence Creator also requests storage commitment for those documents. The purpose of the test is to check the C-Store command/data set (presumed to be easy) and to check the request for Storage Commitment.

References

Instructions

Run these commands/steps from a terminal emulator (Unix) or DOS/Command Window (Windows).

1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/evdcrt.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Clear the MESA Image Manager of existing data in the database:

   perl scripts/reset_servers.pl

4. Transmit (C-Store) one or more Evidence Documents to the MESA Image Manager.

5. Request Storage Commitment of the Image Manager for your Evidence Documents.

If you need to rerun the test, make sure you clear the MESA Image Manager using step 3.

Evaluation

1. Run this evaluation script:

   perl 301/eval_301.pl <output level> <AE Title Storage Commit SCU>

You have successfully completed the test when the logfile (301/grade_301.txt) indicates 0 errors.

2. Submit the logfile 301/grade_301.txt into gazelle as the results for this test.

Supplemental Information

The MESA Image Manager does not automatically send Storage Commitment N-Event reports. If your application would like to receive the appropriate Storage Commitment N-Event report, run this script.

Enter script information.

Test 311: Create/Render Encapsulated PDF

Test 311 is a duplicate of Test 313; run Test 313 instead.

In test 311, the Evidence Creator C-Stores one Encapsulated PDF Evidence Document to the MESA Image Manager. The purpose of the test is to check the C-Store command of the Encapsulated PDF document and to extract the PDF from the DICOM object.

References

Instructions

Run these commands/steps from a terminal emulator (Unix) or DOS/Command Window (Windows).

1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/imgcrt.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Clear the MESA Image Manager of existing data in the database:

   perl scripts/reset_servers.pl

4. Transmit (C-Store) one Encapsulated PDF Evidence Document to the MESA Image Manager.

5. If you need to rerun the test, make sure you clear the MESA Image Manager using step 3.

Evaluation

1. Run this evaluation script:

   perl 311/eval_311.pl <output  level> 

You have successfully completed the test when the logfile (311/grade_311.txt) indicates 0 errors and the file 311/311.pdf contains the PDF you encapsulated in your document.

2. Submit the logfile 311/grade_311.txt and the file 311/311.pdf into gazelle as the results for this test.

Supplemental Information

Test 312: Create/Render SWF Evidence Document

In test 312, the Evidence Creator C-Stores one Evidence Document to the MESA Image Manager. The purpose of the test is to check the C-Store command of the Evidence Document.

This test is designed for workflow profiles (Rad SWF, Eyecare) where the definition of an Evidence Document is not as strict as would be found in a specialized Evidence Documents profile. Evidence documents can be DICOM SR objects, encapsulated PDF documents, screen captures and other DICOM objects. Choose the Evidence Document that is appropriate for your Integration Profile.

References

Instructions

Run these commands/steps from a terminal emulator (Unix) or DOS/Command Window (Windows).

1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/imgcrt.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Clear the MESA Image Manager of existing data in the database:

   perl scripts/reset_servers.pl

4. Transmit (C-Store) one Evidence Document to the MESA Image Manager.

Evaluation

There is no formal evaluation script for this test. Once you have successfully stored a DICOM Evidence Document to the MESA Image Manager:

  1. Create a .txt file with the name: YOUR_SYSTEM_NAME_312.txt
  2. Enter notes in the text file to indicate the type of document stored and that you completed the assignment. Include the test number, current date, company name and system name (an illustrative example appears after these steps).
  3. Submit the text file into gazelle as the results for this test.
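
For illustration only, the contents of such a file (with placeholder company and system names) might read:

   Test 312 - ACME_EVIDENCE_CREATOR (Acme Imaging) - 2019-03-11
   Stored one Encapsulated PDF Evidence Document to the MESA Image Manager.
   Assignment completed.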

Supplemental Information

Test 313: C-Store Encapsulated PDF

In test 313, the Evidence Creator C-Stores one Encapsulated PDF Evidence Document to the MESA Image Manager. The purpose of the test is to check the C-Store command of the Encapsulated PDF document and to extract the PDF from the DICOM object.

References

Instructions

Run these commands/steps from a terminal emulator (Unix) or DOS/Command Window (Windows).

1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/evdcrt.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. Clear the MESA Image Manager of existing data in the database:

   perl scripts/reset_servers.pl

4. Transmit (C-Store) one Encapsulated PDF Evidence Document to the MESA Image Manager.

5. If you need to rerun the test, make sure you clear the MESA Image Manager using step 3.

Evaluation

1. Run this evaluation script:

   perl 313/eval_313.pl <output  level> 

You have successfully completed the test when the logfile (313/grade_313.txt) indicates 0 errors and the file 313/313.pdf contains the PDF you encapsulated in your document.

2. Submit the logfile 313/grade_313.txt and the file 313/313.pdf into gazelle as the results for this test.

Supplemental Information

Evidence Creator Test 500: Display Calibration

Evidence Creators supporting the Consistent Presentation of Images Integration Profile must calibrate their displays in accordance with DICOM PS 3.14. Instructions for this test are included in the document: Display Consistency Test Plan for Image Creator.


Evidence Creator Test 511: Key Image Note 511

In this test, the Evidence Creator will create a Key Image Note that refers to a single image from a series.

References

Instructions

1. Create/modify the SQL script to identify the Evidence Creator under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. The steps below are run from the directory $MESA_TARGET/mesa_tests/rad/actors/imgcrt.

4. Load the data sets into the MESA Image Manager.

   perl 5xx/load_img_mgr.pl

5. Retrieve the study for the patient CRTHREE^PAUL.

6. Create a Key Image Note with the parameters described below. Send the DICOM composite object to the MESA Image Manager.

Evaluation

1. Evaluate the contents of your Key Image Note as follows:

   perl 511/eval_511.pl [-v]

Supplemental Information

If you need to send the note a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

   perl  scripts/reset_servers.pl

Table of information to use when creating the Key Object Note:

Template Identifier: 2010:DCMR (Required)
Document Title: 113000:DCM:Of Interest (Required)
HAS CONC MOD CODE: 121049:DCM:Language of Content Item and Descendants = ISO369_2:eng:English (Optional)
HAS OBS CONTEXT CODE: 121005:DCM:Observer Type = 121006:DCM:Person (Required)
HAS OBS CONTEXT PNAME: 121008:DCM:Person Observer Name = MOORE^STEVE (Required)
CONTAINS TEXT: 113012:DCM:Key Object Description = Key Object Test 511 (Required)
Image Reference: Select the image with Image Number 16 (Required)


Evidence Creator Test 512: Key Image Note 512

In this test, Evidence Creators will create a Key Image Note that refers to two images from one series.

References

Instructions

1. Create/modify the SQL script to identify the Evidence Creator under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. The steps below are run from the directory $MESA_TARGET/mesa_tests/rad/actors/imgcrt.

4. Load the data sets into the MESA Image Manager.

   perl 5xx/load_img_mgr.pl

5. Retrieve the study for the patient CTFIVE^JIM.

6. Create a Key Image Note with the parameters described below. Send the DICOM composite object to the MESA Image Manager.

Evaluation

1. Evaluate the contents of your Key Image Note as follows:

   perl 512/eval_512.pl [-v]

Supplemental Information

If you need to send the note a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

   perl  scripts/reset_servers.pl

Table of information to use when creating the Key Object Note:

Template Identifier: 2010:DCMR (Required)
Document Title: 113007:DCM:For Patient (Required)
HAS CONC MOD CODE: 121049:DCM:Language of Content Item and Descendants = ISO369_2:eng:English (Optional)
HAS OBS CONTEXT CODE: 121005:DCM:Observer Type = 121006:DCM:Person (Required)
HAS OBS CONTEXT PNAME: 121008:DCM:Person Observer Name = MOORE^STEVE (Required)
CONTAINS TEXT: 113012:DCM:Key Object Description = Key Object Test 512 (Required)
Image Reference: Select the images with Image Numbers 67 and 68 (Required)

Evidence Creator Test 513: Key Image Note 513

In this test, Evidence Creators will create a Key Image Note that refers to two images, each from a different series.

References

Instructions

1. Create/modify the SQL script to identify the Evidence Creator under test.

2. Start the MESA servers as described in Starting the MESA Servers above.

3. The steps below are run from the directory $MESA_TARGET/mesa_tests/rad/actors/imgcrt.

4. Load the data sets into the MESA Image Manager.

   perl 5xx/load_img_mgr.pl

5. Retrieve the study for the patient MRTHREE^STEVE.

6. Create a Key Image Note with the parameters described below. Send the DICOM composite object to the MESA Image Manager.

Evaluation

1. Evaluate the contents of your Key Image Note as follows:

   perl 513/eval_513.pl [-v]

2. Submit the evaluation output into gazelle as the results for this test.

If you need to send the note a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

   perl  scripts/reset_servers.pl

Supplemental Information

Table of information to use when creating Key Object Note

Template Identifier: 2010:DCMR (Required)
Document Title: 113004:DCM:For Teaching (Required)
HAS CONC MOD CODE: 121049:DCM:Language of Content Item and Descendants = ISO369_2:eng:English (Optional)
HAS OBS CONTEXT CODE: 121005:DCM:Observer Type = 121006:DCM:Person (Required)
HAS OBS CONTEXT PNAME: 121008:DCM:Person Observer Name = MOORE^STEVE (Required)
CONTAINS TEXT: 113012:DCM:Key Object Description = Key Object Test 513 (Required)
Image Reference: Select Image 9 from Series 103 and Image 19 from Series 104 (Required)

Evidence Creator Test 521: Consistent Presentation of Images

This test is for Evidence Creators that support the Consistent Presentation of Images integration profile. Instructions for this test are found in the document Display Consistency Test Plan for Image Creator.


Evidence Creator Test 552: Example Key Image Note

The goal of this test is to send representative samples to the Project Manager for distribution to other vendors. These samples will be based on tests 511, 512, and 513.

References

Instructions

Either create DICOM Part 10 files with your original DICOM files and submit them (steps 2-3) or follow the instructions below.

1. After you complete tests 511, 512, and 513, locate the Key Image notes stored on the MESA Image Manager. These will be in $MESA_STORAGE/imgmgr/instances. The first directory level is the Study Instance UID. You should recognize your Key Image Notes by the Series Instance UID used to identify the next directory.

2. Upload the objects into gazelle under Connectathon -> List of samples.... (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

3. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the results for this test.

4. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.


Evaluation

The evaluation of this test comes in the form of feedback from other users of your data. If other users identify issues with your data, you will be asked to work with those users (and Project Manager) to resolve those issues.

Supplemental Information

Evidence Creator Test 1412: PWF CT

Test 1412 is a test of the steps for Post Processing Workflow in a CT 3D reconstruction scenario. CT images will be created and a 3D reconstruction workitem will be scheduled. As an Evidence Creator, you will be asked to query for the post processing worklist and to claim the scheduled workitem. Subsequent post processing workflow steps are not yet implemented.

References

Instructions

1. Run the active manager test script:

   perl 1412/1412_imgcrt.pl

2. Follow the test instructions

Evaluation

1. Run the evaluation script:

   perl 1412/eval_1412.pl

The evaluation script should yield 0 errors for a successful test.

2. Submit the output of the grade file found in 1412/grade_1412.txt to the Project Manager.

Supplemental Information


Evidence Creator Test 1700: Evidence Document Description

In the Evidence Documents profile, Evidence Documents are defined as DICOM SR objects that are to be used to assist in diagnosis. An example would be measurements on an Ultrasound device.

The purpose of this test is to make sure that Evidence Creators and Acquisition Modalities in the Evidence Documents profile understand that the content they are to produce is contained in DICOM SR objects according to the Evidence Documents profile. As mentioned above, the most obvious example is Ultrasound measurements. Another example could be a Mammography CAD file. Evidence Documents (in the Evidence Documents profile) are not images, nor are they DICOM SR Diagnostic Reports.

The instructions below are not a joke. We have had experience with this profile indicating some users do not understand the intent of the Evidence Documents profile.

References

Instructions

1. Create a text file and answer the questions below (a skeleton example appears after these steps):

1. List the DICOM SOP class(es) used by your system to generate Evidence Documents. This should be one or more Structured Report Classes.
2. Describe in 100-500 words why your documents are to be considered evidence and are not merely diagnostic reports or other SR objects.
3. Describe in 100-500 words what problems other vendors will have in rendering your document or incorporating your results in a diagnostic report.

2. Name the text file using the convention: CompanyName_1700.txt

3. Submit the text file into gazelle as the results for this 'test'.
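
For illustration only, a skeleton CompanyName_1700.txt (with placeholder content; your answers must describe your own system) might be organized like this:

   Acme Imaging - ACME_EVIDENCE_CREATOR - 2019-03-11

   1. SOP Classes used to generate Evidence Documents:
      Comprehensive SR Storage (1.2.840.10008.5.1.4.1.1.88.33)

   2. Why these documents are evidence and not diagnostic reports (100-500 words):
      Our SR objects carry raw measurements produced at the workstation and are
      intended as input to a later diagnostic report ...

   3. Problems other vendors may have rendering or incorporating the results (100-500 words):
      Some measurements are encoded with private codes in addition to standard codes ...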

Evaluation

The Project Manager will read the answers provided to make sure your system is operating in the proper context.

Supplemental Information

Evidence Creator Test 1720: Evidence Document Samples

The purpose of this test is to collect SR objects from all Evidence Creator actors prior to the Connectathon. These vendors/actors are required to submit SR objects for every SR SOP Class supported.

In this “test”, you submit sample SR(s). The purpose is to identify interoperability problems before the Connectathon by distributing these objects to Image Display actors for them to render.

  • You should submit at least one example
  • Note that your DICOM objects MUST NOT contain real patient demographics.
  • Note that your SR(s) will be distributed to other vendors.

These files will be used by the Image Display vendors/actors. It is requested that this test, in particular, be completed at least two weeks in advance of the pre-Connectathon test completion date to allow the Image Display actors to test the display of each of these objects and to allow time for communication if there is a problem.

Instructions

  • Determine a representative set of Structured Reports for your Evidence Creator that would help other actors understand your content. Create samples for each SOP class and template you support
  • Assuming you have this ability, render the SRs you produced as a reference. The goal is for an Image Display actor to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.
  • Upload sample objects and the screen capture snapshot into Gazelle under Connectathon -> List of Samples. On the Samples to share tab, upload your sample image(s) under the "ED" (evidence documents) entry.
  • Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.
  • As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.
  • You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Evidence Creator Test 1701: Evidence Document Management in Scheduled Workflow

Test 1701 covers evidence document management in scheduled workflow, part of the Evidence Document profile. The test itself is an implementation of the scheme shown in Figure 14.2-1 (see IHE TF Vol I, section 14).

References

Instructions

1. To run this test, be sure you are in the $MESA_TARGET/mesa_tests/rad/actors/evdcrt directory. Then, execute:

   perl  scripts/imgmgr_swf.pl 1701 

Evaluation

1. To evaluate this test:

   perl  1701/eval_1701.pl

The evaluation script should yield 0 errors for a successful test.

2. Submit the grade file found in 1701/grade_1701.txt into gazelle as the results for this test.

Supplemental Information

Nuclear Medicine Specific Tests

Evidence Creator Test 2800: Result Screen Export Documentation

Evidence Creator test 2800 is a documentation test in which the developer documents how the Evidence Creator satisfies the requirements listed in Rad TF-2: 4.18.4.1.2.4.

Reference

Rad TF-2: 4.18.4.1.2.4

Instructions

1. Read Rad TF-2: 4.18.4.1.2.4

2. Create a text file labeled: SYSTEM_2800.txt

3. Copy and answer these questions in the text file and upload the file into gazelle as the results for this test. (An illustrative example answer file appears after the questions.)


1. Does your system produce Dynamic Result screens? Are these stored as Multi-Frame Secondary Capture (MFSC) IODs? If not, then you are not following the specifications properly.
2. Does your system produce Static Result screens?
3. Does your system produce Greyscale Result screens?
4. Does your system produce Color Result screens?
5. Does your system produce Multi-Frame Secondary Capture (MFSC) IODs for storing Static Result Screens?
6. Does your system produce DICOM Secondary Capture (SC) IODs for storing Static Result Screens?
7. Does your system present color results? If so, does your system export those results in 24-bit RGB? (If not, then you are not following the specifications properly.)
8. List the value stored in Conversion Type (0008 0064)
9. List the value stored in Series Description (0008 103E)
10. List the value stored in Derivation Description (0008 2111)
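
For illustration only, a hypothetical SYSTEM_2800.txt for a system that exports static greyscale result screens might begin as follows. The attribute values are examples, not requirements; answer for your own system.

   1. Dynamic Result screens: No
   2. Static Result screens: Yes
   3. Greyscale Result screens: Yes
   4. Color Result screens: No
   5. MFSC IODs for Static Result Screens: No
   6. SC IODs for Static Result Screens: Yes
   7. Color results / 24-bit RGB export: Not applicable
   8. Conversion Type (0008 0064): WSD
   9. Series Description (0008 103E): NM Result Screen
   10. Derivation Description (0008 2111): Result screen derived from reconstructed NM study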

Evaluation

Supplemental Information

Evidence Creator Test 2801: NM Reconstructed Images Special Requirements

Reference

Rad TF-2: 4.18.4.1.2.3

Instructions

In test 2801, reconstructed tomographic datasets are tested for these attributes:

0054 0022 Detector Information Sequence
>> 0020 0037 Image Orientation
0018 0088 Spacing Between Slices

Evaluation

Supplemental Information


Evidence Creator Test 2802: NM Reconstructed Images Cardiac Views

Reference

Rad TF-2: 4.18.4.1.2.3

Instructions

In test 2802, reconstructed tomographic datasets are tested for these attributes:

0054 0220 View Code Sequence
0054 0500 Slice Progression Direction
0040 0555 Acquisition Context Sequence

Evaluation

Supplemental Information


Evidence Creator Test 2803: NM Result Export

Reference

Rad TF-2: 4.18.4.1.2.4

Instructions

Evaluation

Supplemental Information

Evidence Creator Test 2804: NM Result Export Screen 1

Test 2804 examines Export Result Screen data for values as defined in Rad TF-2: 4.18.4.1.2.4.

Reference

Rad TF-2: 4.18.4.1.2.4

Instructions

1. Produce one or more NM Result Export Screens per Rad TF-2: 4.18.4.1.2.4.

2. Create a DICOM Part 10 file or store in the MIR CTN format (IVRLE, no preamble, no group 0002).

3. Copy/ftp/carrier pigeon the file (or files) to the MESA box/computer.

Evaluation

1. Run the evaluation script for test 2804:

   perl 2804/eval_2804.pl <log level> FILE1

2. Submit the evaluation log (2804/grade_2804.txt) into gazelle as the results for this test.

Supplemental Information

Log level is a value from 1 to 4 (1 is low, 4 produces more messages). When sending the evaluation output to the Project Manager, use a value of 3 or 4.

Cardiology Specific Tests

Evidence Creator Test 20640: Evidence Creation – Cath – Vendor Interoperability

Test 20640 tests the creation and content of an SR with a Cath template.

The purpose of this test is to collect SR objects with Cath templates from all Evidence Creator actors prior to the Connectathon. These vendors/actors are required to submit SR objects for every Cath template and SR SOP Class supported.


  • You should submit at least one example
  • Note that your DICOM objects MUST NOT contain real patient demographics.
  • Note that your SR(s) will be distributed to other vendors.

These files will be used by the Image Display vendors/actors. It is requested that this test, in particular, be completed at least two weeks in advance of the pre-Connectathon test completion date to allow the Image Display actors to test the display of each of these objects and to allow time for communication if there is a problem.

Instructions

1. Create DICOM SRs that are representative of the output of your system/product

2. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share' for the "ED" (evidence docs) type of sample. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

3. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

4. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

5. You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Evidence Creator Test 20641: Evidence Creation – Echo – Vendor Interoperability

Test 20641 tests the creation and content of an SR with an Echo template.

The purpose of this test is to collect SR objects with Echo templates from all Evidence Creator actors prior to the Connectathon. These vendors/actors are required to submit SR objects for every Echo template and SR SOP Class supported.

In this “test”, you use the MESA tools to submit sample SR(s). The purpose is to identify interoperability problems before the Connectathon by distributing these objects to Image Display actors for them to render.

  • You should submit at least one example
  • Note that your DICOM objects MUST NOT contain real patient demographics.
  • Note that your SR(s) will be distributed to other vendors.

These files will be used by the Image Display vendors/actors. It is requested that this test, in particular, be completed at least two weeks in advance of the MESA test completion date to allow the Image Display actors to test the display of each of these objects and to allow time for communication if there is a problem.

Instructions

1. Create DICOM SR(s) that are representative of the output of your system/product

2. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share' for the "ED" (evidence docs) type of sample. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

3. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

4. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

5. You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Evidence Creator Test 20642: Evidence Creation – CTA/MRA – Vendor Interoperability

Test 20642 tests the creation and content of an Enhanced SR with a CT/MR Cardiovascular Analysis Report template.

The purpose of this test is to collect SR objects from all Evidence Creator actors prior to the Connectathon. These vendors/actors are required to submit SR objects for every SR SOP Class supported.

In this “test”, you use the MESA tools to submit sample SR(s). The purpose is to identify interoperability problems before the Connectathon by distributing these objects to Image Display actors for them to render.

  • You should submit at least one example
  • Note that your DICOM objects MUST NOT contain real patient demographics.
  • Note that your SR(s) will be distributed to other vendors.

These files will be used by the Image Display vendors/actors. It is requested that this test, in particular, be completed at least two weeks in advance of the pre-Connectathon test completion date to allow the Image Display actors to test the display of each of these objects and to allow time for communication if there is a problem.


Instructions

1. Create DICOM SRs that are representative of the output of your system/product

2. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share' for the "ED" (evidence docs) type of sample. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

3. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

4. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

5. You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Evidence Creator Test 20643: Evidence Creation – Stress Test – Vendor Interoperability

Test 20643 tests the creation and content of an Enhanced SR with a Stress Testing Report template.

The purpose of this test is to collect SR objects from all Evidence Creator actors prior to the Connectathon. These vendors/actors are required to submit SR objects for every SR SOP Class supported.

In this “test”, you use the MESA tools to submit sample SR(s). The purpose is to identify interoperability problems before the Connectathon by distributing these objects to Image Display actors for them to render.

  • You should submit at least one example
  • Note that your DICOM objects MUST NOT contain real patient demographics.
  • Note that your SR(s) will be distributed to other vendors.

These files will be used by the Image Display vendors/actors. It is requested that this test, in particular, be completed at least two weeks in advance of the MESA test completion date to allow the Image Display actors to test the display of each of these objects and to allow time for communication if there is a problem.

Instructions

1. Start the MESA servers as described in Starting MESA Servers.

2. Clear the MESA Image Manager.

   perl  scripts/clear_img_mgr.pl

3. C-STORE DICOM objects to the MESA Image Manager.

4. Locate the DICOM objects stored by the MESA Image Manager. These are in $MESA_STORAGE/imgmgr/instances.

5. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

6. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

7. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

8. You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Supplemental Information

Evidence Creator Test 20700: STRESS Profile -- SOP Class Support

Instructions

There are no test steps to execute for this test. Instead, create a text file which lists all of the SOP classes which your Evidence Creator is capable of creating in the STRESS profile. Your file should have the following naming convention: CompanyName_Product_20700_EC_2008.
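
For illustration only, a hypothetical file might list entries such as the following; the SOP Classes you list must reflect what your Evidence Creator actually creates in the STRESS profile.

   Ultrasound Multi-frame Image Storage   1.2.840.10008.5.1.4.1.1.3.1
   Ultrasound Image Storage               1.2.840.10008.5.1.4.1.1.6.1
   12-lead ECG Waveform Storage           1.2.840.10008.5.1.4.1.1.9.1.1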

Submit the text file into gazelle as the results for this test.

Evidence Creator Test 20740: Stress Profile - Sample Datasets – Vendor Interoperability

The purpose of this test is to collect sample DICOM objects from all Evidence Creator actors in the STRESS profile prior to the Connectathon. You should submit samples for every SOP Class supported. These files will be used by the Image Display vendors/actors. It is requested that this test, in particular, be completed at least two weeks in advance of the MESA test completion date to allow the Image Display actors to test the display of each of these objects and to allow time for communication if there is a problem.

Instructions

1. Start the MESA servers as described in Starting MESA Servers.

2. Clear the MESA Image Manager.

   perl  scripts/clear_img_mgr.pl

3. C-STORE DICOM objects to the MESA Image Manager.

4. Locate the DICOM objects stored by the MESA Image Manager. These are in $MESA_STORAGE/imgmgr/instances.

5. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

6. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

7. As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

8. You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Supplemental Information

Image Fusion (FUS) Profile Tests

Evidence Creators in the Image Fusion Profile must execute the tests in this section. The test images that your Evidence Creator will use are provided in the MESA_STORAGE distribution in the FUSION directory.


Evidence Creator Test 3510: Create and Store Spatial Registration – same FoR

<This test is not yet implemented in the MESA tools.>

In this test, the Evidence Creator will create a Spatial Registration Object that spatially aligns two series of images. These series have the same Frame of Reference. This tests transaction [RAD-56] in the Image Fusion Profile and Storage Commitment [RAD-10].

References

Rad TF-1: 20.4.2
Rad TF-3: 4.56
DICOM PS3.3 – 2004 A.30

Instructions

1. Load the test images for 3510 onto your Evidence Creator. There are two series of images for patient Chestnut^C.

2. Use your application to register the two datasets (registration could be based on fiducials or image content). C-STORE the DICOM Spatial Registration Object to the MESA Image Manager.

3. Request Storage Commitment.

Evaluation

1. Evaluate the contents of your Spatial Registration IOD and Storage Commitment N-ACTION as follows:

   perl 3510/eval_3510.pl  <log_level>

If you need to send the Spatial Registration object a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

   perl  scripts/reset_servers.pl

Supplemental Information

Evidence Creator Test 3512: Create and Store Spatial Registration – different FoR

In this test, the Evidence Creator will create a Spatial Registration Object that spatially aligns two series of images received from the MESA image server. These series have different Frames of Reference. The image dataset can be copied to the Evidence Creator from the CD/DVD provided with the MESA tools. Alternatively, it can be C-STOREd using the test script (below) for this test.

This tests transaction [RAD-56] in the Image Fusion Profile and Storage Commitment.

Note: The Registration created in this test will be used again in subsequent tests 3514 and 3516.

References

Rad TF-1: 20.4.2
Rad TF-3: 4.56
DICOM PS3.3 – 2004 A.30


Instructions

Create/modify the SQL script to identify the Evidence Creator under test.

1. If you are not using the test script to C-STORE the images to your Evidence Creator, load the test images for 3512 onto your Evidence Creator. There are two series of images for patient ZZ^Z.

2. Make sure the MESA servers have been started as described in the Starting MESA Servers section above.

3. From the $MESA_TARGET/mesa_tests/rad/actors/evdcrt directory, run the following script:

   perl scripts/evdcrt_fusion.pl 3512 <log_level>

Evaluation

1. Evaluate the contents of your Spatial Registration IOD as follows:

    perl 3512/eval_3512.pl <log_level> <Storage Commit AE Title>

If you need to send the Spatial Registration object a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

     perl scripts/reset_servers.pl

Supplemental Information

Evidence Creator Test 3514: Create BSPS with Spatial Registration

In this test, the Evidence Creator (or Modality) creates a valid DICOM BSPS object which references a Spatial Registration. You may reference the Spatial Registration Object created in test 3512 or create a new one.

You must store both the BSPS and Spatial Registration objects during this test. You can C-STORE the two objects in any order.

Storage Commitment may be sent, but it will not be evaluated in this test.

References

Rad TF-3: 4.57.4.1.2


Instructions

1. You should already have the test images and Spatial Registration object from test 3512 on your Evidence Creator. If not, the images can be C-STOREd to your Evidence Creator within the test script.

2. Make sure the MESA servers have been started as described in the Starting MESA Servers section above.

3. From the $MESA_TARGET/mesa_tests/rad/actors/evdcrt directory, run the following script:

   perl scripts/evdcrt_fusion.pl 3514 <log_level>

Evaluation

1. Evaluate the contents of your BSPS and Spatial Registration object as follows:

   perl 3514/eval_3514.pl <log_level> <EC C-STORE AE Title>

If you need to send the BSPS and Spatial Registration objects a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest objects.

  perl  scripts/reset_servers.pl

Supplemental Information

This test uses the same source PT/CT images from 3512. You might choose to re-use the Spatial Registration object you created for 3512.

Evidence Creator Test 3516: Modify Existing Spatial Registration

<This test is not yet implemented in the MESA tools.>

In this test, the Evidence Creator will modify the Spatial Registration object from test 3512. The Evidence Creator will modify the transformation for one or both datasets and create a new Spatial Registration instance.

Note: Test 3512 must be run before this one.

References

Rad TF-3: 4.56.4.1.12

Instructions

1. The Evidence Creator should locate the Spatial Registration created for test 3512.

2. Use your application to modify the transformation of the dataset(s): perform a horizontal flip and rotate 90 degrees right.

3. Send the new DICOM Spatial Registration object to the MESA Image Manager.

Evaluation

1. Evaluate the contents of your Spatial Registration IOD as follows:

   perl 3516/eval_3516.pl [-v]

If you need to send the Spatial Registration object a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

   perl  scripts/reset_servers.pl

Supplemental Information


Evidence Creator Test 3540: Create and Store BSPS IOD – same FoR

In this test, the Evidence Creator will create a Blended Softcopy Presentation State (BSPS) object that specifies how to blend the two series of images for display. These series have the same Frame of Reference. This tests transaction [RAD-57] in the Image Fusion Profile and Storage Commitment [RAD-10]. Spatial Registration is tested in another test.

Note: The BSPS created in this test will be used again in the next test, 3541.

References

Rad TF-1: 20.4.3
Rad TF-3: 4.57
DICOM PS3.3 – 2004 A.30


Instructions

1. Load the test images for 3540 onto your Evidence Creator. There are two series of images for patient Papaya^P. You can also use the test script below to C-STORE these images to your Evidence Creator.

2. Make sure the MESA servers have been started as described in the Starting MESA Servers section above.

3. From the $MESA_TARGET/mesa_tests/rad/actors/evdcrt directory, run the following script:

   perl scripts/evdcrt_fusion.pl 3540 <log_level>

Evaluation

1. Evaluate the contents of your BSPS IOD as follows:

   perl 3540/eval_3540.pl <log_level>  <Storage Commit AE Title>

If you need to send the BSPS object a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

   perl  scripts/reset_servers.pl

Supplemental Information

Evidence Creator Test 3541: Modify Existing BSPS

In this test, the Evidence Creator will modify the Blended Softcopy Presentation State (BSPS) object created in test 3540. Test 3540 must be run before this one. The Evidence Creator will modify the transparency of the SUPERIMPOSED dataset, create a new BSPS instance, and store the new BSPS object, with Storage Commitment, to the MESA Image Manager.

References

Rad TF-3: 4.57.4.1.2

Instructions

1. The Evidence Creator should locate the Blended Softcopy Presentation State created for test 3540.

2. Make sure the MESA servers have been started as described in the Starting MESA Servers section above.

3. From the $MESA_TARGET/mesa_tests/rad/actors/evdcrt directory, run the following script:

   perl scripts/evdcrt_fusion.pl 3541 <log_level>

Evaluation

1. Evaluate the contents of your BSPS IOD as follows:

   perl 3541/eval_3541.pl <log_level>  <Storage Commit AE Title>

If you need to send the BSPS object a second time, you should clear the MESA Image Manager first. This will allow the evaluation software to examine your latest object.

    perl  scripts/reset_servers.pl

Supplemental Information

Evidence Creator Test 3620: FUS - Example Image, BSPS and Spatial Registration Objects

For the Connectathon, meaningful interoperability testing for the Image Fusion profile will rely largely on the quality of the data sets supplied by the Acquisition Modality and Evidence Creator vendors.

The goal of this “test” is to provide samples for other vendors to display. You should send a “representative sample” of the data produced by your system.

Blended Softcopy Presentation State (BSPS) and Spatial Registration objects are used in the Image Fusion Profile. Both Evidence Creator and Acquisition Modality actors may create and store BSPS and Spatial Registration objects. In addition to BSPS objects, you should also submit the datasets (images, Spatial Registrations) you reference within your BSPS object, even if you did not produce them. That will allow other actors to display the original images with the appropriate BSPS and Spatial Registration objects.

Each system should send samples of the Image, BSPS and Spatial Registration objects. If you create BSPS and/or Spatial Registration objects in the Image Fusion profile, you should also send along the image series the BSPS and Spatial Registration objects reference.

In order to facilitate their testing, please submit your samples 2-3 weeks before the usual test deadlines, and earlier if possible.

References

Instructions

Either create DICOM Part 10 files and submit them (see steps 5-7) or follow the instructions below.

1. Start the MESA servers as described in Starting MESA Servers.

2. Clear the MESA Image Manager (if necessary). From $MESA_TARGET/mesa_tests/rad/actors/mod:

   perl scripts/clear_img_mgr.pl

3. Send sample images/BSPS/Spatial Registration objects to the MESA Image Manager.

4. Assuming you have this ability, render the study you produced as a reference. The goal is for a consumer of your images to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.

5. Upload sample objects (DICOM Part 10 files) and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

6. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

7. As Image Display actors import your images, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

Evaluation

The evaluation of this test comes in the form of feedback from other users of your data. If other users identify issues with your data, you will be asked to work with those users (and Project Manager) to resolve those issues.

Supplemental Information

Mammography Image Profile Specific Tests

Evidence Creator Test 3916: Sample Mammo CAD SR Objects - Vendor Interoperability

In this “test”, you submit a sample DICOM Mammography CAD SR and the associated image objects. The purpose is to identify interoperability problems before the Connectathon by distributing these objects to Image Display actors for them to render.

  • You should create at least one example
  • You must also submit any 'For Presentation' images that are referenced by the Mammography CAD SR instance.
  • Finally, you must submit any 'For Processing' SOP instances referenced in the Source Image Sequence of the 'For Presentation' images.
  • Note that your DICOM objects MUST NOT contain real patient demographics.
  • Note that your SR(s) will be distributed to other vendors.

If possible, encode as many of the attributes identified in RAD TF-2: 4.16.4.2.2.1.1.8 as you can, so that Image Displays can be thoroughly exercised.

Instructions

  • Assuming you have this ability, render the SR(s) you produced as a reference. The goal is for a consumer of your SR to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.
  • Upload the sample Mammography CAD SR, the referenced images, and the screen capture into gazelle under Connectathon-->List of samples.
  • Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.
  • As Image Display actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.
  • You may submit more than one set.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to display the contents of your objects. If they find issues in displaying the studies, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Mammography Acquisition Workflow Profile Specific Tests

Evidence Creator Test 4102: MAWF Vendor Interoperability - View Correction

For the Connectathon, meaningful interoperability testing for the MAWF profile will rely largely on the quality of the data sets supplied by the Acquisition Modality and Evidence Creator vendors.

In this “test”, you create a sample data set for the View Correction case that will be reviewed by other participants. The goal of this test is to prepare other actors (Image Manager and Image Displays) so they are not surprised during Connectathon events. In order to facilitate their testing, please submit your samples 2-3 weeks before the usual test deadlines.

References

RAD TF-2:4.18.4.1.2.5

RAD TF-2:4.66.4.2.2

Instructions

A1. Obtain a mammography study with an incorrect view encoded; include both FOR PROCESSING and FOR PRESENTATION images. Then simulate a "View Correction" case and create images that encode the correct view information; consult the Technical Framework references above.
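
As an illustration of step A1, the sketch below takes one 'For Presentation' image, replaces its View Code Sequence and writes the result as a new SOP Instance. It assumes the Python pydicom library; the file names are hypothetical and the code values shown are only illustrative, so take the actual corrected view codes from the Technical Framework references above.

   # Sketch only: create a corrected-view copy of one image (file names hypothetical).
   import pydicom
   from pydicom.dataset import Dataset
   from pydicom.uid import generate_uid

   ds = pydicom.dcmread("wrong_view_for_presentation.dcm")

   corrected_view = Dataset()                       # illustrative code values only;
   corrected_view.CodeValue = "R-10242"             # use the correct view codes per
   corrected_view.CodingSchemeDesignator = "SRT"    # the TF references above
   corrected_view.CodeMeaning = "cranio-caudal"
   ds.ViewCodeSequence = [corrected_view]

   # The corrected image is a new SOP Instance (and possibly a new Series),
   # not an overwrite of the incorrect original.
   ds.SOPInstanceUID = generate_uid()
   ds.file_meta.MediaStorageSOPInstanceUID = ds.SOPInstanceUID
   ds.save_as("corrected_view_for_presentation.dcm", write_like_original=False)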

A2. Create a KOS with the title "Rejected for Patient Safety Reasons" which references the incorrect images.

B. Place the images (original and corrected) and the KOS in DICOM Part 10 files using your own tools or follow the numbered instructions below.
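
Before packaging the set, a quick way to double-check that the KOS carries the expected title and points at the incorrect (not the corrected) images is a read-back like the sketch below, again assuming the Python pydicom library and a hypothetical file name.

   # Sketch only: read back the rejection KOS and list what it references.
   import pydicom

   kos = pydicom.dcmread("rejection_kos.dcm")       # hypothetical file name
   title = kos.ConceptNameCodeSequence[0]
   print("KOS document title:", title.CodeMeaning,
         "(", title.CodeValue, title.CodingSchemeDesignator, ")")
   for study in kos.CurrentRequestedProcedureEvidenceSequence:
       for series in study.ReferencedSeriesSequence:
           for ref in series.ReferencedSOPSequence:
               print("   rejects", ref.ReferencedSOPInstanceUID)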

C. Assuming you have this ability, render the images you produced as a reference. The goal is for a consumer of your images to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.

D. Upload the sample objects (DICOM Part 10 files) and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

E. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

F. As Image Display actors import your images, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

Follow these steps if you need to use the MESA tools to create DICOM files


1. Start the MESA servers as described in Starting MESA Servers.

2. Clear the MESA Image Manager.

   perl scripts/clear_img_mgr.pl

3. C-STORE DICOM objects to the MESA Image Manager.

4. Locate the DICOM objects stored by the MESA Image Manager. These are in $MESA_STORAGE/imgmgr/instances.

5. Upload the objects into gazelle under Connectathon-->List of samples.

6. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as results for this test.

7. You may submit more than one sample.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to import/process/display the contents of your objects. If they find issues, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Supplemental Information

Test 4650: DIFF Vendor Interoperability - Exchange Enhanced MR Image Samples

In this “test”, you create a sample data set that will be reviewed by other participants. The goal of this test is to prepare other actors (Image Manager, Image Display) so they are not surprised during Connectathon events. In order to facilitate their testing, please submit your samples as early as possible in the pre-Connectathon period, at the latest 2-3 weeks before the usual test deadlines.

References

RAD TF-2: 4.18.4.1.2.5

Instructions

A. Determine a representative set of derived images (ADC and Isotropic) for your Evidence Creator that would help other actors understand your content. Good samples make for meaningful tests between vendors.

B. Place the image(s) in DICOM Part 10 files using your own tools or follow the numbered instructions below.
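
If your derived objects use the Enhanced MR Image Storage SOP Class, a quick header check before upload can catch obvious problems. The sketch below assumes the Python pydicom library and a hypothetical sample directory.

   # Sketch only: confirm SOP Class, frame count and ImageType of the derived objects.
   import glob
   import pydicom

   ENHANCED_MR = "1.2.840.10008.5.1.4.1.1.4.1"      # Enhanced MR Image Storage SOP Class UID

   for path in sorted(glob.glob("diff_samples/*.dcm")):   # hypothetical directory
       ds = pydicom.dcmread(path, stop_before_pixels=True)
       print(path,
             "Enhanced MR" if ds.SOPClassUID == ENHANCED_MR else ds.SOPClassUID.name,
             "frames:", ds.get("NumberOfFrames"),
             "ImageType:", list(ds.get("ImageType", [])))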

C. Assuming you have this ability, render the image(s) you produced as a reference. The goal is for a consumer of your images to examine your rendering and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.

D. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)


E. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

F. As Image Display and Image Manager actors import your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

Follow these steps if you need to use the MESA tools to create DICOM files


1. Start the MESA servers as described in Starting MESA Servers.

2. Clear the MESA Image Manager.

   perl scripts/clear_img_mgr.pl

3. C-STORE DICOM objects to the MESA Image Manager.

4. Locate the DICOM objects stored by the MESA Image Manager. These are in $MESA_STORAGE/imgmgr/instances. Tar or zip these files.
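
If you do not have a preferred archiver, the following sketch (Python standard library only) zips everything under the instances directory; the archive name is a placeholder.

   # Sketch only: zip the stored instances for upload.
   import glob
   import os
   import zipfile

   instances_dir = os.path.expandvars("$MESA_STORAGE/imgmgr/instances")
   with zipfile.ZipFile("enhanced_mr_samples.zip", "w") as zf:     # placeholder archive name
       for path in glob.glob(os.path.join(instances_dir, "**", "*"), recursive=True):
           if os.path.isfile(path):
               zf.write(path, arcname=os.path.relpath(path, instances_dir))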

5. Upload sample objects and the screen capture snapshot into gazelle under Connectathon-->List of samples. Select your system name and add objects under 'Sample to share'. (Note that when the DICOM objects are uploaded, this may trigger an automatic DICOM validation service. You should take note of the output of the validation.)

6. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

7. You may submit more than one sample.

Evaluation

The evaluation of this test comes in the form of feedback from vendors who try to import/process/display the contents of your objects. If they find issues, you will be asked to work with those vendors (and the Project Manager) to resolve those issues.

Supplemental Information


Eye Care Specific Tests

Test 281: Example Images and other DICOM objects

Test 281 is used to collect sample images, Key Object Notes, Evidence Documents and/or other DICOM composite objects produced by an Evidence Creator. The intent of the test is to send DICOM composite objects (DICOM Part 10 format) to the Connectathon Manager for redistribution to other participants. This will allow them time to test/examine your data before an in-person meeting.

Please submit your samples 2 WEEKS before the normal deadlines. This will give the other systems a chance to review your data.

Instructions

1. Produce DICOM Part 10 files for the types of Evidence Documents your system produces. If you produce more than one type of Evidence Document, produce at least one DICOM Part 10 file for each document type.
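
A simple way to confirm you have at least one file per Evidence Document type is to group your Part 10 files by SOP Class, for example with the sketch below (it assumes the Python pydicom library; the directory name is hypothetical).

   # Sketch only: group sample files by SOP Class to check coverage per document type.
   import glob
   from collections import defaultdict
   import pydicom

   by_sop_class = defaultdict(list)
   for path in glob.glob("evidence_documents/*.dcm"):    # hypothetical directory
       ds = pydicom.dcmread(path, stop_before_pixels=True)
       by_sop_class[ds.SOPClassUID.name].append(path)

   for sop_class, files in sorted(by_sop_class.items()):
       print(sop_class, ":", len(files), "file(s)")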

2. Assuming you have this ability, render the images/objects you produced as a reference. The goal is for an Image Display actor to examine your rendered images and compare to what their software produces. Perform a screen capture and/or save as JPEG or other reasonable format.

3. Upload DICOM files and the screen capture snapshot onto ftp.aao.org. Contact the connectathon manager to obtain the username/password.

4. Create a short txt file indicating you have completed the upload step. Upload that txt file into gazelle as the result file for this test.

5. As Image Display and Image Manager actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.


Evaluation

Evaluation of this test occurs when other participants review your data. In the event that other participants find errors/issues, you may be asked to modify your data.

Test 50216: ECED Example Document

In this “test”, you provide some Eye Care Evidence Document (ECED) samples for review by other vendors.

References

Instructions

  1. Create one or more sample ECED objects. Store these in DICOM Part 10 format.
  2. You can create a DICOM Part 10 CD with a DICOMDIR file or just create DICOM Part 10 files without the DICOMDIR (one way to generate a DICOMDIR is sketched after this list). If you need help creating DICOM Part 10 files, contact the Connectathon Manager.
  3. Create a short .txt file that describes the data you have created. You do not need to write a book, but something to help the other systems.
  4. If you can, provide a sample rendering of the document as a JPEG or HTML file.
  5. Place the DICOM Part 10 files, text description and sample rendering in a zip file. Name the zip file SYSTEM_NAME_50216.zip where SYSTEM_NAME refers to the name assigned to your system in the Kudu or Gazelle tool. It is not your company product name.
  6. For the Eye Care connectathon: Upload this file onto ftp://ftp.aao.org along with the screen shots into the folder for Test 50216.
  7. As the test results for this test, submit a txt file into gazelle indicating you have uploaded your sample.
  8. As other actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.
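
If you choose the DICOMDIR option in step 2 and have no tool for it, one way to build a basic File-set is sketched below. It assumes the Python pydicom library (version 2.1 or later) and hypothetical directory names, and is not a substitute for whatever media-creation tooling your product normally uses.

   # Sketch only: build a basic DICOM File-set (DICOMDIR + instances) with pydicom.
   import glob
   from pydicom.fileset import FileSet

   fs = FileSet()
   for path in sorted(glob.glob("eced_samples/*.dcm")):   # hypothetical directory
       fs.add(path)                                       # stage each instance for the File-set
   fs.write("ECED_CD")                                    # writes DICOMDIR and instances under ECED_CD/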

Evaluation

Evaluation is provided by other systems as they render your samples.

Supplemental Information

Test 50217: ECDR Example Report

In this “test”, you provide some Eye Care Displayable Report (ECDR) samples for review by other vendors.

References

Instructions

  1. Create one or more sample ECDR objects. Store these in DICOM Part 10 format.
  2. You can create a DICOM Part 10 CD with a DICOMDIR file or just create DICOM Part 10 files without the DICOMDIR. If you need help creating DICOM Part 10 files, contact the Connectathon Manager.
  3. Create a short .txt file that describes the data you have created. You do not need to write a book, but something to help the other systems.
  4. If you can, provide a sample rendering of the report as a JPEG or HTML file.
  5. Place the DICOM Part 10 files, text description and sample rendering in a zip file. Name the zip file SYSTEM_NAME_50217.zip where SYSTEM_NAME refers to the name assigned to your system in the Kudu or Gazelle tool. It is not your company product name.
  6. For the Eye Care connectathon: Upload this file onto ftp://ftp.aao.org along with the screen shots into the folder for Test 50217.
  7. As the test log for this test, submit a txt file into gazelle indicating you have uploaded your sample.
  8. As other actors render your data, you may receive a request for interpretation or directives from the Project Manager to repair attributes. This may prove to be an iterative process.

Evaluation

Evaluation is provided by other systems as they render your samples.

Supplemental Information