MESA/Portable Media Creator

From IHE Wiki


Portable Media Creator Tests

Integration Profiles and Test Procedures

This document lists a number of tests for Media Creator systems. You may not be responsible for all of these tests.
Please refer to the Connectathon web tool for the list of tests required for your system. The web address of this tool depends on the year and the Project Manager; please contact the appropriate Project Manager to obtain it.


Test Cases: PDI

This section describes test cases that are generally associated with the PDI Integration Profile. There may be some overlap with other profiles.


Test Case 1901: Media Creator Mount Point

Test 1901 is used to make sure we have the proper mount point for the removable media. On Windows systems, this will be something like D:\ or E:\. On Linux or other varieties of Unix, it will be something like this:

/mnt/cdrom
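On Linux, one way to discover the mount point is to inspect /proc/mounts (or the output of mount). The sketch below is only an illustration: device names such as /dev/sr0 or /dev/cdrom are typical but are not guaranteed on every distribution.

```shell
# Print the mount point of the first optical device listed in
# /proc/mounts-style input (fields: device, mount point, fs type, ...).
find_cdrom_mount() {
  awk '$1 ~ /^\/dev\/(sr[0-9]+|cdrom)/ { print $2; exit }'
}

# Against the live system:   find_cdrom_mount < /proc/mounts
# Demonstration with canned input:
printf '/dev/sr0 /mnt/cdrom iso9660 ro 0 0\n' | find_cdrom_mount
# → /mnt/cdrom
```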

There are two implementations of test 1901. The preferred implementation is the RSNA 2005 PDI Software (runs on Windows only). The deprecated implementation is the original MESA tool.

References

RAD TF-3: 4.47.4.1.2

Instructions for 1901 (Implementation 1): RSNA 2005 PDI Software

  1. You will need the PDI Media Creator software, available on the MESA Software distribution page: http://ihedoc.wustl.edu/mesasoftware/index.htm
  2. Then access the link for the MESA software for this year's Connectathon.
  3. Start the RSNA 2005 PDI Software (desktop icon, or Programs -> RSNA PDI Media Tester -> RSNA PDI).
  4. Select and execute test 1901.

Evaluation (for Implementation 1)

  1. When you complete the test, look at the bottom left hand corner to see the location where the log files are written. Go to that directory and retrieve grade_pdi_media.txt and error_pdi_media.txt. Upload a zip of these files along with a screen capture of RSNA PDI Media Tester as your results for this test.

Instructions for 1901 (Implementation 2): MESA Command Line

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation (for Implementation 2)

  1. Run the evaluation script:
     perl 1901/eval_1901.pl 4 <mount point>
  2. The output file is 1901/grade_1901.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test.
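The same pass criterion applies to every MESA grade file in this document: the last line must report 0 errors. A minimal sketch of that check follows; the exact wording of the summary line may vary between MESA releases, so treat the pattern as an assumption.

```shell
# Pass when the final line of a grade file reports zero errors.
# The '(^|[^0-9])0 errors' pattern avoids matching counts like "10 errors".
check_grade() {
  tail -n 1 "$1" | grep -Eq '(^|[^0-9])0 errors'
}

# Demonstration with a fabricated grade file:
printf 'detail line\nTest 1901 complete: 0 errors\n' > /tmp/grade_1901.txt
check_grade /tmp/grade_1901.txt && echo PASS
# → PASS
```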

Test Case 1902: Media Creator General Disk Format

This test is not ready with this release of software.
In test 1902, the media is tested for general format. This is a test of the media file system (conformance with ISO 9660 Level 1).
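ISO 9660 Level 1 restricts file names to at most eight characters, an optional extension of at most three, and the character set A-Z, 0-9 and underscore. The sketch below is a rough pre-check of the naming rule you can run over a staging directory before burning; it covers only file names, not the full file-system conformance that test 1902 evaluates.

```shell
# Return success if a file name satisfies the ISO 9660 Level 1 naming
# rules: 8.3 form, at most one dot, characters limited to A-Z, 0-9, _.
is_iso9660_level1() {
  case "$1" in
    *.*.*) return 1 ;;            # more than one dot
    *[!A-Z0-9_.]*) return 1 ;;    # illegal character (e.g. lowercase)
  esac
  base=${1%%.*}
  ext=${1#"$base"}; ext=${ext#.}
  [ "${#base}" -ge 1 ] && [ "${#base}" -le 8 ] && [ "${#ext}" -le 3 ]
}

is_iso9660_level1 DICOMDIR && echo ok      # 8 chars, no extension
is_iso9660_level1 readme.txt || echo bad   # lowercase is not allowed
```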

References

RAD TF-3: 4.47.4.1.2.1

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1902/eval_1902.pl 4 <mount point>
  2. The output file is 1902/grade_1902.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.

Supplemental Information

  1. The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.

Test Case 1903: Media Creator File Conventions

Test 1903 examines the media for general file conventions. This includes naming conventions and finding DICOM instances in the root directory.


References

RAD TF-3 4.47

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1903/eval_1903.pl 4 <mount point>
  2. The output file is 1903/grade_1903.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.


Supplemental Information

  1. The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.

Test Case 1904: Media Creator DICOMDIR content

Test 1904 examines the content of the DICOMDIR file. A number of different tests are run on this file.

References

RAD TF-3 4.47

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1904/eval_1904.pl 4 <mount point>
  2. The output file is 1904/grade_1904.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload this file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.


Supplemental Information

  1. The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.

Test Case 1905: Media Creator Object Contents

Test 1905 examines the content of the DICOM instance files. A number of different tests are run on each file.

References

Rad TF-3 4.47

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1905/eval_1905.pl 4 <mount point>
  2. The output file is 1905/grade_1905.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.


Supplemental Information

  1. The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.

Test Case 1910: Media Creator Basic Web Content

Test 1910 examines media supporting the Web option. It tests for requirements listed in that option.


References


Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1910/eval_1910.pl 4 <mount point>
  2. The output file is 1910/grade_1910.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.


Supplemental Information

  1. The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.

Test Case 1911: Media Creator Naming Conflicts

Test 1911 examines the media to determine whether there are naming conflicts that violate the rules listed for media creators.

References


Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1911/eval_1911.pl 4 <mount point>
  2. The output file is 1911/grade_1911.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.


Supplemental Information

  1. The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.

Test Case 1912: Media Creator Prefix Conflicts

Test 1912 examines the media to determine whether there are prefix conflicts that violate the rules listed for media creators. Files and folders are not allowed to start with certain reserved prefixes.
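As a sketch, a reserved-prefix scan over the media root might look like the following. The IHE_ prefix used here is only an illustration; consult RAD TF-3 for the authoritative list of reserved prefixes.

```shell
# Flag root-level entries whose names begin with a reserved prefix.
# The IHE_ prefix below is illustrative, not the authoritative list.
scan_prefixes() {
  dir=$1
  for entry in "$dir"/*; do
    [ -e "$entry" ] || continue     # skip when the directory is empty
    name=${entry##*/}
    case "$name" in
      IHE_*) echo "prefix conflict: $name" ;;
    esac
  done
}

# Demonstration with a fabricated media root:
mkdir -p /tmp/media_root
: > /tmp/media_root/IHE_FOO
: > /tmp/media_root/README.TXT
scan_prefixes /tmp/media_root
# → prefix conflict: IHE_FOO
```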


References


Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Set the current directory to $MESA_TARGET/mesa_tests/rad/actors/media_crt.
  2. Mount your disk on the MESA system. Determine the mount point.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script:
     perl 1912/eval_1912.pl 4 <mount point>
  2. The output file is 1912/grade_1912.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Upload the grade file as the results for this test. The Project Manager may examine the file and require you to modify your query and/or to perform additional queries.

Supplemental Information

The evaluation script's first argument is the <log level> (4 in the command above). When debugging, it is sometimes helpful to use log level 1 to see only differences. When submitting results, please use the most verbose level, 4.


Test Case 1945: SOP Classes on PDI media

Create a text file that lists the DICOM SOP Classes and Transfer Syntaxes your application is capable of storing to PDI media. Upload that file as the result log for this test. (Alternatively, you may submit a DICOM conformance statement that contains this information.)

The purpose of this 'test' is to help us understand your Portable Media Creator actor in the context of your product. For example, if you are an acquisition modality, you may create PDI media containing only one or two DICOM SOP classes; if you are a PACS workstation, you may be able to write many different kinds of images and SRs to PDI media.
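A minimal example of such a listing follows. The system name is hypothetical; the UIDs shown are the standard DICOM UIDs for these SOP Classes and Transfer Syntaxes. Extend the list to match your product.

```
System: ExampleVendor ModalityStation (hypothetical)

SOP Classes written to PDI media:
  CT Image Storage                 1.2.840.10008.5.1.4.1.1.2
  Secondary Capture Image Storage  1.2.840.10008.5.1.4.1.1.7

Transfer Syntaxes:
  Implicit VR Little Endian        1.2.840.10008.1.2
  Explicit VR Little Endian        1.2.840.10008.1.2.1
```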

Test Case 1950: Submit sample media containing Basic Viewer

If you can submit sample media containing a Basic Viewer in advance of the connectathon, you may have an opportunity for an evaluator to take a look at your application and give you feedback. We understand that you may still be in development, so if all requirements are not yet met, you may still submit a sample.

When you are ready to submit a Basic Viewer on PDI media for early evaluation, create a text file indicating this, along with your name and email address. Upload that file as the log returned for this test. That will prompt the Project Manager to contact you and arrange to collect the sample media.

Test Case 1951: Submit performance with BIR small dataset

This test gathers performance numbers for the Basic Viewer on PDI media. Only PMC actors that support the Basic Viewer option need to perform this test. You may not have achieved all of the required performance targets when you submit this test; that is OK during pre-connectathon testing. Submit the performance numbers you are getting now; we just want to make sure you are testing performance in your lab and are working towards the targets.

References

Instructions

  1. Obtain the small dataset from the IHE wiki. See link in the References section.
  2. Create a CD containing the small dataset and your Basic Viewer application. Read RAD TF-1: 15.6.5 (in the BIR profile) to understand the test conditions.
  3. Create a text file or Word document with the following format to document the platform you are testing on:
    • Gazelle system name:
    • Date:

    Reference Platform:

    • Processor -
    • RAM -
    • Hard Drive -
    • Video -
    • Operating System -
    • Frameworks -
    • Display System -
  4. Append to the file actual measurements for the following Benchmarks documented in RAD TF-1:15.6.6-2
    Actual performance:
    • Start-up to ready to select -
    • Series thumbnail select to first image of series displayed -
    • Series thumbnail to last image of series displayed -
    • Select different series -
    • Scroll frame rate -
    • Zoom -
    • Window center and width change response time -
  5. Perform measurements using the small dataset, following the guidance in RAD TF-1: 15.6.5. Although the profile requires repeating the test for each benchmark at least 4 times, we are only asking for one measurement here.
  6. Save the performance results, and rename the file to be <your_company_name>-BIR-performance-small.txt
  7. When you are done testing performance, upload the file containing your platform and measurement details into Kudu as the results for this test.

Test Case 1952: Submit performance with BIR large dataset

This test gathers performance numbers for the Basic Viewer on PDI media. Only PMC actors that support the Basic Viewer option need to perform this test. You may not have achieved all of the required performance targets when you submit this test; that is OK during pre-connectathon testing. Submit the performance numbers you are getting now; we just want to make sure you are testing performance in your lab.

References

Instructions

  1. Obtain the large dataset from the IHE wiki. See link in the References section.
  2. Create a CD containing the large dataset and your Basic Viewer application. Read RAD TF-1: 15.6.5 (in the BIR profile) to understand the test conditions.
  3. Create a text file or Word document with the following format to document the platform you are testing on:
    • Kudu system name:
    • Date:

    Reference Platform:

    • Processor -
    • RAM -
    • Hard Drive -
    • Video -
    • Operating System -
    • Frameworks -
    • Display System -
  4. Append to the file actual measurements for the following Benchmarks documented in RAD TF-1:15.6.6-2
    Actual performance:
    • Start-up to ready to select -
    • Series thumbnail select to first image of series displayed -
    • Series thumbnail to last image of series displayed -
    • Select different series -
    • Scroll frame rate -
    • Zoom -
    • Window center and width change response time -
  5. Perform measurements using the large dataset, following the guidance in RAD TF-1: 15.6.5. Although the profile requires repeating the test for each benchmark at least 4 times, we are only asking for one measurement here.
  6. Save the performance results, and rename the file to be <your_company_name>-BIR-performance.txt
  7. When you are done testing performance, upload the file containing your platform and measurement details into Kudu as the results for this test.


Test Case 4700: BIR: Basic Viewer on PDI media Connectathon tests

This test has moved here: https://gazelle.ihe.net/content/birtestdata

Test Case 13501: XDM Media Creator Mount Point

DEPRECATED

Test Case 13502: XDM Media General Disk Format

Instructions

Test 13502 is a duplicate of test 1902. Execute test 1902 and submit those results as if they were for test 13502.

Test Case 13511: XDM Media Creator File Conventions

Test 13511 examines the file names and folder structure of media created by the Media Creator in the XDM Integration Profile.

References

ITI TF-2 3.32.4.1.2

Instructions

  1. Create media to be exported by the XDM profile (USB/CDR/ZIP). This can be with any combination of submission sets and documents per submission set. If a zip file, unzip the file, honoring the folder structure you created.
  2. Set the current directory to $MESA_TARGET/mesa_tests/iti/actors/media_crt.
  3. Determine the mount point/folder name for the data and run the evaluation script below.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script
         perl  13511/eval_13511.pl 4 <mount point/folder name>
  2. The output file is 13511/grade_13511.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as the results for this test.


Supplemental Information

The features of the files/folders that are tested are:

  1. README.TXT is present at the root level.
  2. INDEX.HTM is present at the root level and contains properly formatted XHTML.
  3. The root directory contains a folder IHE_XDM.
  4. IHE_XDM contains at least one subfolder for the contents of the submission set, e.g. SUBSET01. The folder name itself is not prescribed, but it must follow the conventions in ITI TF-2: 3.32.4.1.2.1.
  5. All submission set subfolders (e.g. SUBSET01) are tested as follows:
    • The file METADATA.XML is present.
    • The folders contain a single file referenced by METADATA.XML for a single-part document, or a subfolder with two or more files for a multi-part document.
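Putting the items above together, a conforming volume might look like the layout below. SUBSET01, SUBSET02, and the DOC* names are examples only; the profile does not prescribe them.

```
<media root>
  README.TXT
  INDEX.HTM
  IHE_XDM/
    SUBSET01/
      METADATA.XML
      DOC00001.XML        (single-part document)
    SUBSET02/
      METADATA.XML
      DOC00002/           (multi-part document: two or more files)
        PART1.XML
        PART2.XML
```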

Test Case 13512: XDM Media Creator Metadata Schema Validation

Test 13512 is equivalent to test 11720. Perform test 11720 and submit the results as if you were submitting them for 13512. The test description for 11720 is found, along with the other XDS tests, on NIST's XDS test website.


References


Instructions

  1. Generate data to be exported (USB/CDR/ZIP). Extract the METADATA.XML file (or files) from the data.

Evaluation

  1. Evaluate the XDS metadata for correct syntax/composition using test 11720.


Supplemental Information

Test Case 13513: XDM Media Creator Metadata Schematron Test

Test 13513 uses Schematron to examine the metadata produced for one or more exported media. The Schematron test is designed to look for specific values in the metadata that are required but are not tested as part of schema validation.

References


Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Create media to be exported by the XDM profile (USB/CDR/ZIP). This can be with any combination of submission sets and documents per submission set. Place the METADATA.XML file to be evaluated on the MESA computer (ftp, rcp, etc.)
  2. Set the current directory to $MESA_TARGET/mesa_tests/iti/actors/media_crt.
  3. Run the evaluation script below.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script
         perl  13513/eval_13513.pl 4 <path to METADATA.XML>
  2. The output file is 13513/grade_13513.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as results for this test.


Supplemental Information

Test Case 13514: XDM Media Creator, One Submission Set, Single Part Doc

In test 13514, the Media Creator creates data for export in the XDM profile with one submission set and a single document.

References

ITI TF-2:3.32.4.1.2.2

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Create media to be exported by the XDM profile (USB/CDR/ZIP). This data should contain one (and only one) submission set with one (and only one) document. If a zip file, unzip the file, honoring the folder structure you created.
  2. Set the current directory to $MESA_TARGET/mesa_tests/iti/actors/media_crt.
  3. Determine the mount point/folder name for the data and run the evaluation script below.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script
         perl  13514/eval_13514.pl 4 <mount point/folder name>
  2. The output file is 13514/grade_13514.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as the results for this test.

Supplemental Information

The features of the files/folders that are tested are:

  1. README.TXT is present at the root level.
  2. INDEX.HTM is present at the root level and contains properly formatted XHTML.
  3. The root directory contains a folder IHE_XDM.
  4. IHE_XDM contains one subfolder for the contents of the submission set; the folder name itself is not prescribed, but it must follow the conventions in ITI TF-2: 3.32.4.1.2.1.
  5. The subfolder must contain:
    • the file METADATA.XML
    • one file that is a single document; the file name must follow the conventions in ITI TF-2: 3.32.4.1.2.1

Test Case 13515: XDM Media Creator, One Submission Set, Multi Part Doc

In test 13515, the Media Creator creates data for export in the XDM profile with one submission set containing a multi-part document. If your application cannot create a multi-part document, upload a note into Gazelle indicating that as the result for this test.


References

ITI TF-2:3.32.4.1.2.2

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Create media to be exported by the XDM profile (USB/CDR/ZIP). This data should contain one (and only one) submission set with a multi-part document. If a zip file, unzip the file, honoring the folder structure you created.
  2. Set the current directory to $MESA_TARGET/mesa_tests/iti/actors/media_crt.
  3. Determine the mount point/folder name for the data and run the evaluation script below.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script
         perl  13515/eval_13515.pl 4 <mount point/folder name>
  2. The output file is 13515/grade_13515.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as the results for this test.


Supplemental Information

The features of the files/folders that are tested are:

  1. README.TXT is present at the root level.
  2. INDEX.HTM is present at the root level and contains properly formatted XHTML.
  3. The root directory contains a folder IHE_XDM.
  4. IHE_XDM contains one subfolder for the contents of the submission set; the folder name itself is not prescribed, but it must follow the conventions in ITI TF-2: 3.32.4.1.2.1.
  5. The subfolder must contain:
    • the file METADATA.XML
    • a subfolder which contains the multi-part document; there may be two or more files in this folder, and their file names must follow the conventions in ITI TF-2: 3.32.4.1.2.1

Test Case 13516: XDM Media Creator, Two Submission Sets, Single Part Doc

In test 13516, the Media Creator creates data for export in the XDM profile with two submission sets, each with a single document. If your application cannot create two submission sets as required by this test, upload a note into Gazelle indicating that as the result for this test.

References

ITI TF-2:3.32.4.1.2.2

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Create media to be exported by the XDM profile (USB/CDR/ZIP). This data should contain exactly 2 submission sets, each with one (and only one) document. If a zip file, unzip the file, honoring the folder structure you created.
  2. Set the current directory to $MESA_TARGET/mesa_tests/iti/actors/media_crt.
  3. Determine the mount point/folder name for the data and run the evaluation script below.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script
         perl  13516/eval_13516.pl 4 <mount point/folder name>
  2. The output file is 13516/grade_13516.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as the results for this test.


Supplemental Information

The features of the files/folders that are tested are:

  1. README.TXT is present at the root level.
  2. INDEX.HTM is present at the root level and contains properly formatted XHTML.
  3. The root directory contains a folder IHE_XDM.
  4. IHE_XDM contains two subfolders for the contents of the submission sets.
  5. The two subfolders each contain the file METADATA.XML.
  6. The two subfolders each contain one file that is a single document referenced by the METADATA.XML.

Test Case 13517: XDM Media Creator, Two Submission Sets, Multi Part Doc

In test 13517, the Media Creator creates data for export in the XDM profile with two submission sets, each with a multi-part document.


References

ITI TF-2:3.32.4.1.2.2

Instructions

To run this test, follow these steps using a DOS window or terminal emulator:

  1. Create media to be exported by the XDM profile (USB/CDR/ZIP). This data should contain two submission sets, each with a multi-part document. If it is a zip file, unzip the file, honoring the folder structure you created.
  2. Set the current directory to $MESA_TARGET/mesa_tests/iti/actors/media_crt.
  3. Determine the mount point/folder name for the data and run the evaluation script below.

Evaluation

To evaluate your response to this test:

  1. Run the evaluation script
         perl  13517/eval_13517.pl 4 <mount point/folder name>
  2. The output file is 13517/grade_13517.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as the results for this test.


Supplemental Information

The features of the files/folders that are tested are:

  1. README.TXT is present at the root level.
  2. INDEX.HTM is present at the root level and contains properly formatted XHTML.
  3. The root directory contains a folder IHE_XDM.
  4. IHE_XDM contains two subfolders, one each for the contents of the two submission sets.
  5. Each of the two subfolders (submission sets) must contain:
    • the file METADATA.XML
    • another subfolder which contains the multi-part document (i.e., two or more files)

Test Case 13520: XDM Media Creator, ATNA Logging

In test 13520, the Media Creator generates an ATNA log message that indicates PHI was exported. The Media Creator will create media for a specific patient so that the log message can be properly evaluated.

The Media Creator shall not use the IHE Year 4 provisional schema. This test will specify the required log message.

References

ITI TF-2:3.32.4.1.4

ITI TF-2:3.20.6

Instructions

  1. Create media to be exported by the XDM profile. Use the demographics listed below:
    • Name = FILLMORE^MARCUS
    • Patient ID = 13520
    • DOB = 19750428
    • Sex = M
  2. Perform the media export, generating an ATNA log message (see Supplemental Information). Transmit the ATNA log message to the MESA Audit Record Repository (port 4000).


Evaluation

To evaluate your response to this test:

  1. From $MESA_TARGET/mesa_tests/iti/actors/media_crt, run the evaluation script
         perl  13520/eval_13520.pl 4
  2. The output file is 13520/grade_13520.txt. This test is successfully completed when the last line in the output file indicates 0 errors.
  3. Submit the grade file as the results for this test.


Supplemental Information

Test Case 13521: XDM Media Creator, Sample CD - CD-R Option

In test 13521, the Media Creator submits sample media to the Project Manager. The goal is to make your sample available to Connectathon participants for their review.

Please submit your samples two weeks in advance of the general deadline. This will give the Media Importers time to review your samples.

References


Instructions

  1. Use your application to create CD-R media to be exported by the XDM profile, and place an ISO image of that CD in a zip file named after your system in Gazelle, e.g. Your_Gazelle_SystemName_13521.zip, where 13521 refers to this test number.
  2. Upload the zip file to the Wiki page reserved for samples for the current connectathon. Do not load the samples directly into Gazelle as the results for this test. There should be an existing link you can click that will allow you to upload the file. If your system is not listed, contact the Project Manager or add the link to the table.
  3. Create a short txt file that briefly explains the data.
  4. Upload that txt file into the Gazelle system as the results for this test.

Evaluation

This test is evaluated by other participants. Should they find errors or issues with your submission, you will be required to repair your media and submit a new sample. This is an iterative (though imperfect) process.

Supplemental Information

Test Case 13522: XDM Media Creator, Sample USB

In test 13522, the Media Creator submits sample media to the Project Manager. The goal is for the Project Manager to redistribute the media to other participants for their review.

Please submit your samples two weeks in advance of the general deadline. This will give the Media Importers time to review your samples.


Instructions

  1. Create USB media to be exported by the XDM profile.
  2. You will send your sample to the Project Manager for distribution to other participants. Submit a physical USB drive with a capacity of at least 1 GB. All media becomes the property of the Project Manager and will not be returned.
  3. When your sample is ready, submit a file <SystemName>-13522-readme.txt that briefly explains the data and upload it as results for this test. You will receive a note from the project manager containing instructions for submitting your sample USB.

Test Case 13523: XDM Media Creator, Sample CD - ZIP over Email Option

In test 13523, the Media Creator sends a secure email message to the Project Manager with a zip file attachment, as specified by the XDM profile.


Instructions

  1. Create ZIP media to be exported by the XDM profile.
  2. Email the sample to the Project Manager per the directions in the XDM profile (S/MIME).
  3. Create a short txt file that briefly explains the data, and a note indicating that you have emailed your sample. Upload that txt file into the Kudu system as the results for this test.

Test Case 11962: Portable Media Creator - Submit XDM content for evaluation

Instructions

The instructions for this test reside on the IHE NIST XDS test kit website, linked on the Index to IHE Test Tools: http://wiki.ihe.net/index.php?title=IHE_Test_Tool_Information#Index_to_IHE_Test_Tools