Revision as of 10:50, 8 February 2020

Connectathon Using Gazelle Test Management Only

Assumptions

  • You have access to a copy of Gazelle Test Management:
    • You are using an existing managed copy
    • You have an agreement with Kereval for them to install (or manage) a copy for you.
    • Both of these options include database replication so that all appropriate test definitions are available. You write test definitions in the Gazelle Master Model, and these are replicated to individual copies of Gazelle through an automated process.
  • All profile definitions and test definitions that you need are entered in Gazelle Master Model.
    • Profile definitions and test definitions are normally entered by the Domain Technical Project Manager. You should consult with the appropriate manager or managers, especially if you are extending tests in a domain.
    • If you have created a new domain or own your own profile, that implies you have entered the profile definitions and test definitions in the Gazelle Master Model.
  • You have reviewed all test definitions / requirements and know:
    • They do not need extra tools (or you have those tools)
    • You have support on site or remotely to respond to participant questions. The amount of support required depends on the nature of the event: a strict testing event might require more support, while a collaborative event that is designed to educate and not post results might require less.

Steps

This is not a User's Manual, but it does provide a general set of steps to follow in Gazelle Test Management. Some of the items in the list below have further detail in the next section.

  1. Create a Testing Session in Gazelle Test Management. This is not a full list, but some essential items are:
    1. Registration close date / time
    2. Start and end date / time of the Connectathon
    3. List of profiles to be tested (only include those you want to test).
  2. Decide if you want to use Gazelle Test Management to generate a contract.
    1. You do not need to use the contract generated by Gazelle Test Management to run a Connectathon.
    2. If you do want that contract, there are some manual steps outside the Web UI. Please contact Kereval.
  3. Prune profiles that do not have sufficient test partners
  4. Decide if you want to use Gazelle to capture system configuration
  5. Enter and assign test monitors
  6. Provide training to participants and monitors
  7. Repeatedly execute test cases with individual test validation
  8. Provide overall grading. Review the body of work for each system/profile/actor and assign a grade (pass, did not complete, ...).

Gazelle Test Management Details

Pruning Profiles Without Sufficient Registration

IHE Integration Profiles include two or more actors in their requirements. You might find that a profile you offer does not have enough registration to support testing. That could mean:

  • You have only one or two systems registered for a specific actor (fewer than the nominal three that are required)
  • You have zero systems registered for a specific actor (meaning the other system or systems will not have a test partner)

The real question is when you discover such gaps. There are different approaches, each with its own pros and cons.

  • Connectathon Manager reviews registration at the close of registration and looks for these gaps.
  • Connectathon Manager delegates this review to a Domain Manager or someone in a similar role, such as a person managing a demonstration at a conference; the technical backgrounds are very similar.
  • You tell participants to perform the review and request assistance if there are concerns. This method has been used at times. You will likely find that participants will not actually perform the review until they reach the event unless they are registered for only a small number of profiles.
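Whichever person performs the review, the check itself is mechanical: count registrations per (profile, actor) pair and flag anything below the threshold. A minimal sketch in Python, assuming registrations are available as (system, profile, actor) records (the data layout, function name, and actor labels here are illustrative, not part of the Gazelle API):

```python
from collections import defaultdict

REQUIRED_SYSTEMS = 3  # nominal minimum number of systems per actor

def find_registration_gaps(registrations, profile_actors, required=REQUIRED_SYSTEMS):
    """Return (profile, actor, count) triples for under-registered actors.

    registrations: iterable of (system, profile, actor) triples
    profile_actors: dict mapping each profile to the set of actors it defines
    """
    counts = defaultdict(int)
    for system, profile, actor in registrations:
        counts[(profile, actor)] += 1

    gaps = []
    for profile, actors in profile_actors.items():
        for actor in sorted(actors):
            n = counts[(profile, actor)]
            if n < required:
                gaps.append((profile, actor, n))
    return gaps

# Three systems registered one actor, none registered its partner actor:
regs = [("SysA", "PIX", "Source"), ("SysB", "PIX", "Source"), ("SysC", "PIX", "Source")]
gaps = find_registration_gaps(regs, {"PIX": {"Source", "Manager"}})
```

Running a check like this at the close of registration gives the Connectathon Manager a short list of profiles to prune or to chase registrations for.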

Capturing System Configuration

Gazelle Test Management is able to record configuration information for each system based on its registration. Examples of configuration include host names, port numbers, full URLs, and HL7 V2 application names. If a profile requires that an actor provide a web services endpoint, the Domain Project Manager will make an entry in Gazelle Master Model to indicate this requirement. After the close of registration for a Connectathon:

  1. The Connectathon Manager can activate a button that generates default configuration values for all systems; then
  2. Each system owner can review the configurations, make the proper adjustments, and declare the configurations complete; then
  3. All participants can see configuration information for all partners and pull in the relevant information
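Step 1 (generating default configuration values) can be pictured as a simple pass over the registered systems. A hedged sketch, using a naming and port-allocation scheme of our own invention (Gazelle's actual defaults differ):

```python
def generate_default_configs(systems, base_port=10000):
    """Assign each registered system a default host name and a unique port.

    systems: list of system keywords, e.g. ["ACME_PACS", "OTHER_RIS"]
    Returns a dict: system -> configuration awaiting owner review.
    """
    configs = {}
    for i, system in enumerate(sorted(systems)):
        configs[system] = {
            "host": system.lower().replace("_", "-"),  # default host name derived from keyword
            "port": base_port + i,                     # unique default port per system
            "approved": False,  # step 2: the system owner must review and approve
        }
    return configs

defaults = generate_default_configs(["OTHER_RIS", "ACME_PACS"])
```

The point of the `approved` flag is step 2 above: defaults are only a starting point, and each owner must adjust and confirm them before partners rely on the values in step 3.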

Some additional points to consider:

  • This is helpful for large events but might be overkill for a small event. You might decide to manage configuration information using other methods like wiki pages or online spreadsheets.
  • This feature becomes more important if you are using the Gazelle Proxy. The Proxy will use configuration information that has been entered.

Assigning Test Monitors

Assigning monitors and managing their worklists are essential for larger events, but also useful for smaller events where you have decided to use Gazelle Test Management to record the results of individual tests. Gazelle Test Management can record the following information:

  • Name of each monitor
  • Photo of each monitor (optional)
  • Table where monitor is sitting (optional)
  • List of tests the monitor will verify

If you choose to enter this information, you can tailor the monitor duties based on their expertise. You can assign one or more monitors to a specific set of profiles to take advantage of their background. You would also assign other monitors to different profiles. This information is used as follows:

  • After a participant places a test in the To Be Verified state, that test will appear in the Gazelle Monitor Worklist.
  • Monitors review the worklist on an ongoing basis. They can see a new test that is in their area, claim the test, and then work with the participants to verify the test.
  • Participants can use the optional information (photo, table location) to locate the monitor.
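The worklist behaviour described above amounts to filtering and sorting test instances. A small sketch, with illustrative field names rather than Gazelle's actual data model:

```python
from datetime import datetime

def monitor_worklist(test_instances, monitor=None, profile=None):
    """Return tests in the 'to_be_verified' state, oldest first.

    With no filters, every ready test is shown so that nothing is missed
    at a small event; the monitor/profile filters narrow the view when
    individual assignments were made.
    """
    ready = [t for t in test_instances if t["state"] == "to_be_verified"]
    if monitor is not None:
        ready = [t for t in ready if t.get("assigned_to") == monitor]
    if profile is not None:
        ready = [t for t in ready if t["profile"] == profile]
    return sorted(ready, key=lambda t: t["marked_at"])  # oldest timestamp first

tests = [
    {"state": "in_progress",    "profile": "PIX", "marked_at": datetime(2020, 2, 8, 9, 0)},
    {"state": "to_be_verified", "profile": "PIX", "marked_at": datetime(2020, 2, 8, 10, 0)},
    {"state": "to_be_verified", "profile": "XDS", "marked_at": datetime(2020, 2, 8, 9, 30)},
]
worklist = monitor_worklist(tests)  # both ready tests, the older XDS test first
```

Sorting by the timestamp at which a test entered To Be Verified is what makes the unassigned worklist workable at a small event: the longest-waiting participants surface first.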

If you are using Gazelle Test Management for a smaller event, the Monitor Worklist is still useful. Even if you do not assign specific tests to individual monitors, the worklist can still show all of the tests that are ready for validation with timestamps. This is useful in itself to make sure that participant work is not inadvertently missed. There are enough filtering items on the worklist page that a monitor can look for specific items (again assuming that individual assignments were not made).

Execute Test Cases

Gazelle Test Management allows participants to start individual test cases, record evidence and comments concerning the test case, and manage the state of the test case (in progress, to be verified, verified, paused, ...). The test cases have been written by Domain Project Managers with the assumption that Gazelle Test Management is used. There are some cases that assume other tools, but that will be covered in another section.

In a Connectathon that is only using Gazelle Test Management and no other tools, the software allows participants and management to do the following:

  • See the list of tests to perform.
    The list of tests to perform is derived from the registration provided by each participant and from data entered previously in Gazelle Master Model. Each system owner will see a list of tests that is tailored to the system registration.
  • Select test partner or partners.
    A participant can select an individual test and then work through a menu to select appropriate test partners. If registration is done properly, each participant will know the roles played by other participants and will not have to spend time canvassing partners.
  • Execute a test, including attaching evidence (screen captures, log files, ...) to the test.
    Tests that are well written include requests for specific data. At a smaller event you might decide not to upload evidence for each test but to have the monitor review the tests in real time. That is certainly feasible and can provide a closer inspection of the process: you are more likely to see the user interface of a participant system and to provide useful feedback.
  • Manage the test lifecycle.
    • Test is started.
    • Evidence is collected.
    • Participant marks the test To Be Verified.
    • Monitor reviews test evidence (both in Gazelle Test Management and in person).
    • Monitor may add comments and ask participants to do more work. This implies a state change as a signal to participants.
    • Monitor and participants eventually come to a final test status.
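The lifecycle above is a small state machine. A sketch of the transitions it implies (the state names follow the list above; Gazelle's actual set of statuses is richer):

```python
# Allowed state transitions, following the lifecycle bullets above.
TRANSITIONS = {
    "started":        {"to_be_verified", "paused"},
    "paused":         {"started"},
    "to_be_verified": {"verified", "started"},  # the monitor can send the test back
    "verified":       set(),                    # final status
}

def advance(state, new_state):
    """Move a test to new_state, rejecting transitions the lifecycle forbids."""
    if new_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"cannot go from {state!r} to {new_state!r}")
    return new_state
```

The "back" transition from To Be Verified to started is the state change mentioned above that signals participants that more work is needed.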


Perform Overall Grading