Connectathon Tasks in Gazelle

The Gazelle Test Management software supports many Connectathon tasks. This page provides a list of those tasks, but does not include training. You do not need to use all of the features listed below.

Pre-Registration: Management

  1. Create a testing session that includes a label, start date, and end date. This provides a container to manage tests and results separately from other testing sessions. There are many configuration items; we mention the main ones here (a rough sketch of such a session record follows this list).
    1. Required: Add the set of profiles to be tested. You can select entire domains (all of ITI) or a subset of profiles to match your objectives.
    2. Optional: Configure Gazelle to generate a contract that the participants can download and sign. The contract software includes the ability to generate a price depending on registration parameters.
    3. Optional: Configure Gazelle to accept registration for individual staff member badges.
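
As a rough illustration only, a testing session can be thought of as a labelled container with a date range, the set of profiles in scope, and the optional contract and badge settings. The field names and values below are assumptions made for the example, not Gazelle's data model.

 # Illustrative only: a minimal "testing session" record holding the main
 # configuration items mentioned above. Field names are assumptions, not
 # the Gazelle schema.
 from dataclasses import dataclass, field
 from datetime import date

 @dataclass
 class TestingSession:
     label: str
     start_date: date
     end_date: date
     profiles_in_scope: set = field(default_factory=set)  # e.g. all of ITI, or a subset
     generate_contracts: bool = False                      # optional contract generation
     individual_badges: bool = False                       # optional per-person badges

 session = TestingSession(
     label="Example Connectathon 2019",
     start_date=date(2019, 4, 8),
     end_date=date(2019, 4, 12),
     profiles_in_scope={"XDS.b", "PIX", "PDQ"},
     generate_contracts=True,
 )
 print(session.label, sorted(session.profiles_in_scope))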

Registration: Participants

  1. Add/manage organization names and user accounts.
  2. Enter a system or systems in the assigned test session.
    1. Enter simple demographics for the system.
    2. Enter a detailed list of profile/actor/option triplets to be tested. This drives the testing process.
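
The sketch below illustrates, with hypothetical Python, how a registration expressed as profile/actor/option triplets can drive test selection: a test applies to a system when the system has registered at least one of the triplets the test exercises. The triplet values and test names are examples, not Gazelle's catalogue.

 # Illustrative only: matching a system's registered triplets against
 # hypothetical test definitions to find the applicable tests.
 from typing import NamedTuple

 class Triplet(NamedTuple):
     profile: str
     actor: str
     option: str

 # A system's registration: the triplets it intends to test.
 registration = {
     Triplet("XDS.b", "Document Source", "None"),
     Triplet("PIX", "Patient Identity Source", "None"),
 }

 # Hypothetical test definitions, each tagged with the triplets it exercises.
 tests = {
     "XDS_Provide_and_Register": {Triplet("XDS.b", "Document Source", "None"),
                                  Triplet("XDS.b", "Document Repository", "None")},
     "PIX_Feed": {Triplet("PIX", "Patient Identity Source", "None")},
     "PDQ_Query": {Triplet("PDQ", "Patient Demographics Consumer", "None")},
 }

 # A test is applicable if the system registered at least one triplet it exercises.
 applicable = [name for name, needed in tests.items() if needed & registration]
 print(applicable)  # ['XDS_Provide_and_Register', 'PIX_Feed']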

Registration: Management

  1. Monitor the registration to determine if there are sufficient partners to test each profile that is offered.
  2. Assist participants with their registration. A typical issue is that they did not understand the need to register for a required profile. You may need to help them update their registration.
  3. Assist participants with account and password issues.
    1. Each organization has at least one administrator who is supposed to review/enable individual accounts. There are times when the administrator is away, and you will be asked by an individual to enable their account.

Pre-Connectathon: Management

  1. Review pre-Connectathon tests submitted by participants. This is to gauge their readiness for the event itself.
  2. Enter parameters in Gazelle to describe the network configuration.
    1. This will allow you to use Gazelle to automatically generate fixed IP addresses for a private network.
  3. Generate default network configuration for all participants. These are the endpoints for connections and related items such as DICOM AE titles, HL7 application names, full URLs, etc. (a sketch of this kind of generation follows this list).
  4. Assign fixed IP addresses to individual hosts for each participant (in batch or individually).
  5. Add new host configurations for a participant on demand.
  6. Update any network configuration as requested by a participant.
  7. Manage Connectathon monitors
    1. Manage monitor accounts
    2. Assign monitors to tests. In some events, all monitors review all test cases. For other events, monitors are specialized and only review test cases in their area of expertise.
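
Gazelle performs the network configuration generation itself. Purely to illustrate the kind of output involved (fixed IP addresses drawn from a private subnet, plus default DICOM AE titles, HL7 application names, and URLs), here is a hypothetical sketch; the subnet, system keywords, and naming scheme are assumptions.

 # Illustrative only: generating default network configuration entries of the
 # kind described above. Gazelle does this internally; the subnet and naming
 # scheme here are assumptions made for the example.
 import ipaddress

 systems = ["ACME_PACS", "BETA_EHR", "GAMMA_RIS"]   # registered system keywords (made up)
 subnet = ipaddress.ip_network("10.200.0.0/24")     # assumed private Connectathon network
 hosts = subnet.hosts()                             # assignable addresses, in order

 for system in systems:
     ip = next(hosts)                               # fixed IP allocated in batch
     config = {
         "system": system,
         "ip": str(ip),
         "dicom_ae_title": system[:16],             # DICOM AE titles are limited to 16 characters
         "hl7_app_name": f"{system}_APP",
         "xds_repository_url": f"https://{ip}:8443/xds/repository",
     }
     print(config)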

Pre-Connectathon: Participants

  1. Execute pre-Connectathon tests. Gazelle generates a customized list of tests based on individual registration. Participants can:
    1. Read the customized list of tests to be performed before the event.
    2. Upload results/evidence for Connectathon manager review.
  2. Read Connectathon test descriptions. Gazelle generates a customized list of tests based on registration.
    1. Note that you will want to coordinate with Domain Project Managers. You will want to be aware if they are improving the test cases while your participants are reading or executing them.
  3. Enter configuration data for their systems. They start from the default values generated by the Gazelle Test Management system. They can update those as necessary.
  4. Read configuration data from peers and possibly download that data through a spreadsheet. This allows the individual participant to quickly determine the endpoints and other required data to communicate with partners.
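
If the configuration data is downloaded as a spreadsheet, a participant can filter it locally to find a partner's endpoint. The sketch below assumes a simple CSV export with made-up column names; it does not describe a documented Gazelle export format.

 # Illustrative only: filtering a downloaded configuration export to find a
 # partner's endpoint. The column names are assumptions about the export.
 import csv
 import io

 # Stand-in for a CSV export of partner configurations.
 export = io.StringIO(
     "system,actor,host,port,url\n"
     "ACME_PACS,Document Repository,10.200.0.1,8443,https://10.200.0.1:8443/xds/repository\n"
     "BETA_EHR,Document Source,10.200.0.2,8443,https://10.200.0.2:8443/xds/source\n"
 )

 wanted_actor = "Document Repository"
 for row in csv.DictReader(export):
     if row["actor"] == wanted_actor:
         print(f"{row['system']} -> {row['url']}")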

Connectathon: Management

  1. Review overall Connectathon progress. There are dashboards that show:
    1. Test results by participant
    2. Test results by monitor
    3. Test results by domain or profile
  2. Review the work queues of the Connectathon monitors.
    1. If there are areas with a backlog of tests, you may need to reassign monitors to reduce that backlog.
  3. Perform Connectathon grading. This is a manual step where you review the test instances that have been validated by your monitor team. If a participant has completed sufficient test cases for an actor/profile/option triplet, you can mark that as passed.
    1. You can enter notes for any triplet for a participant; for example, that they still need to run one more XYZ test to complete their work.
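
Grading remains a manual judgment. The sketch below only illustrates the decision being made for each actor/profile/option triplet, using an assumed status model and an assumed minimum number of verified test instances.

 # Illustrative only: the judgement behind grading a profile/actor/option
 # triplet. In Gazelle this is a manual step; the required count below is an
 # assumption used to show the idea.
 from collections import Counter

 # Verified test instances for one participant, keyed by triplet (made-up data).
 verified_counts = Counter({
     ("XDS.b", "Document Source", "None"): 4,
     ("PIX", "Patient Identity Source", "None"): 1,
 })

 REQUIRED = 3   # assumed minimum number of verified instances per triplet

 for triplet, count in verified_counts.items():
     status = "passed" if count >= REQUIRED else "needs more tests"
     print(triplet, "->", status, f"({count} verified)")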

Connectathon: Participants

  1. Execute Connectathon tests using this cycle:
    1. Search for a test to run. You might filter on a domain, a profile, an actor, ...
    2. Select a test case.
    3. Select test partners for that test case.
    4. Start the test.
    5. Work through all of the steps.
    6. Upload evidence that will be attached to this specific test instance. This includes log files, screen captures, ...
    7. Submit the test for validation.
  2. Review overall progress of testing. There is a dashboard that presents that data.
  3. Upload sample files (e.g., CDA documents) to be shared with test partners.
  4. Validate a sample file using one of the Gazelle validation packages (a local pre-check sketch follows this list).
  5. Download sample files that have been shared by partners.
  6. Read configuration parameters entered by partners (same method as in the pre-Connectathon phase).
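
Before submitting a sample to one of the Gazelle validation packages, a participant might run a quick local sanity check. The sketch below is not the Gazelle validator; it only confirms that a file (for example, a CDA sample) is well-formed XML before it is uploaded.

 # Illustrative only: a quick local well-formedness check run before submitting
 # a sample to a Gazelle validation package. This is not the Gazelle validator;
 # it only confirms the file parses as XML and shows its root element.
 import sys
 import xml.etree.ElementTree as ET

 def precheck(path: str) -> bool:
     try:
         root = ET.parse(path).getroot()
     except ET.ParseError as err:
         print(f"{path}: not well-formed XML ({err})")
         return False
     print(f"{path}: well-formed, root element is {root.tag}")
     return True

 if __name__ == "__main__":
     # Usage example: python precheck.py sample_cda.xml
     precheck(sys.argv[1])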

Connectathon: Monitors

  1. Execute validation cycle:
    1. Search for the next test to validate. There is a work list that is presented based on assignments made by the Connectathon manager. You can perform further filtering.
    2. Claim one or more tests to validate.
    3. Review the test evidence presented by the participants. This can include reviewing evidence that has been entered by participants as well as visiting the participants to see the transactions in real time.
    4. Assign a test result (verified, partially verified, ...). If the test is not marked as verified, you will likely have some more discussions with the participants about what needs to be done to satisfy the test requirements.

Post-Connectathon: Management

  1. Complete the overall grading process if not completed during the event.
    1. This includes making any changes as a result of challenges from participants.
  2. Kereval can take the results from the database and publish them directly in the IHE results database.

Post-Connectathon: Participants

  1. Review Connectathon evaluation (pass, did not complete, withdrawn).
    1. Through a separate channel such as email, the participant can ask the Connectathon manager to review the evaluation.
  2. Review individual test results for engineering purposes.
  3. Retrieve any sample files that were stored during the event for engineering purposes.