Latest revision as of 10:14, 8 February 2020
Connectathon Using Gazelle Test Management Only
This approach assumes the following:
- You have access to a copy of Gazelle Test Management:
- You are using an existing managed copy, or
- You have an agreement with Kereval for them to install (or manage) a copy for you.
- Both of these options include database replication so that all appropriate test definitions are available. Test definitions are written in the Gazelle Master Model and replicated to individual copies of Gazelle through an automated process.
- All profile definitions and test definitions that you need have been entered in the Gazelle Master Model.
- Profile definitions and test definitions are normally entered by the Domain Technical Project Manager. Consult with the appropriate manager or managers, especially if you are extending tests in a domain.
- If you have created a new domain or own your own profile, it is assumed that you have entered the profile definitions and test definitions in the Gazelle Master Model yourself.
- You have reviewed all test definitions and requirements and know that they do not need extra tools (or you have those tools).
- You have support on site or remotely to respond to participant questions. The amount of support required depends on the nature of the event: a strict testing event might require more support, while a collaborative event designed to educate rather than post results might require less.
This is not a User's Manual, but it does provide a general set of steps to follow in Gazelle Test Management. Some of the items in the list below have further detail in the next section.
- Create a Testing Session in Gazelle Test Management. This is not a full list, but some essential items are:
- Registration close date / time
- Start and end date / time of the Connectathon
- List of profiles to be tested (only include those you want to test).
- Decide if you want to use Gazelle Test Management to generate a contract.
- You do not need to use the contract generated by Gazelle Test Management to run a Connectathon.
- If you do want that contract, there are some manual steps outside the Web UI. Please contact Kereval.
- Prune profiles that do not have sufficient test partners
- Decide if you want to use Gazelle to capture system configuration
- Enter and assign test monitors
- Provide training to participants and monitors
- Repeatedly execute test cases with individual test validation
- Provide overall grading. Review the body of work for each system/profile/actor and assign a grade (Pass, Did Not Complete, ...).
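The session setup at the start of this checklist can be sketched as a plain data structure. This is a minimal illustration only; the field names here are assumptions for the sketch, not Gazelle Test Management's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TestingSession:
    # Illustrative fields only -- not Gazelle Test Management's real data model.
    name: str
    registration_close: datetime
    start: datetime
    end: datetime
    profiles: list = field(default_factory=list)  # only profiles you want to test

    def registration_open(self, now: datetime) -> bool:
        """Registration is open strictly before the close date/time."""
        return now < self.registration_close

session = TestingSession(
    name="Connectathon Gamma",
    registration_close=datetime(2020, 1, 10),
    start=datetime(2020, 2, 3),
    end=datetime(2020, 2, 7),
    profiles=["PIX", "PDQ", "XDS.b"],
)
print(session.registration_open(datetime(2020, 1, 1)))   # before close
print(session.registration_open(datetime(2020, 1, 20)))  # after close
```

The point of the sketch is that the registration close and the event start/end are independent values: pruning and configuration generation (described below in this page) happen in the window between them.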
Gazelle Test Management Details
Pruning Profiles Without Sufficient Registration
IHE Integration Profiles include two or more actors in their requirements. You might find that a profile you offer does not have enough registration to support testing. That could mean:
- You have only one or two systems registered for a specific actor (less than the nominal three that are required)
- You have zero systems registered for a specific actor (meaning the other system or systems will not have a test partner)
The key question is when you discover such gaps. There are different approaches, each with its own pros and cons.
- Connectathon Manager reviews registration at the close of registration and looks for these gaps.
- Connectathon Manager delegates this review to a Domain Manager or another suitable person, such as someone managing a demonstration at a conference; the technical background required is very similar.
- You tell participants to perform the review and request assistance if there are concerns. This method has been used at times. You will likely find that participants will not actually perform the review until they reach the event unless they are registered for only a small number of profiles.
Capturing System Configuration
Gazelle Test Management is able to record configuration information for each system based on their registration. Examples of configuration include host names, port numbers, full URLs, HL7 V2 application names. If a profile requires that an actor provide a web services endpoint, the Domain Project Manager will make an entry in Gazelle Master Model to indicate this requirement. After the close of registration for a Connectathon:
- The Connectathon Manager can activate a button that generates default configuration values for all systems; then
- Each system owner can review the configurations, make the proper adjustments, and declare the configurations complete; then
- All participants can see configuration information for all partners and pull in the relevant information
Some additional points to consider:
- This is helpful for large events but might be overkill for a small event. You might decide to manage configuration information using other methods like wiki pages or online spreadsheets.
- This feature becomes more important if you are using the Gazelle Proxy. The Proxy will use configuration information that has been entered.
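The "generate defaults, then owners review" flow described above can be sketched as follows. The hostname pattern and port assignment are purely hypothetical; Gazelle's actual generation rules are configured per event.

```python
def default_configs(systems, base_port=8080):
    """Generate placeholder endpoint configurations, one per system,
    for system owners to review and adjust. Naming scheme is hypothetical."""
    configs = {}
    for i, system in enumerate(systems):
        host = f"{system.lower()}.conn.example.org"
        port = base_port + i
        configs[system] = {
            "host": host,
            "port": port,
            "url": f"http://{host}:{port}/services",
            "reviewed": False,  # owner sets this after checking the values
        }
    return configs

def mark_reviewed(configs, system, **overrides):
    """Owner adjusts any generated values and declares the config complete."""
    configs[system].update(overrides)
    configs[system]["reviewed"] = True

cfg = default_configs(["SysA", "SysB"])
mark_reviewed(cfg, "SysB", port=9443)
print(cfg["SysB"]["port"], cfg["SysB"]["reviewed"])
```

Once all entries are marked reviewed, every participant works from the same shared table instead of canvassing partners for endpoints, which is exactly what the Gazelle Proxy depends on.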
Execute Test Cases
Gazelle Test Management allows participants to start individual test cases, record evidence and comments concerning the test case, and manage the state of the test case (in progress, to be verified, verified, paused, ...). The test cases have been written by Domain Project Managers with the assumption that Gazelle Test Management is used. There are some cases that assume other tools, but that will be covered in another section.
In a Connectathon that is only using Gazelle Test Management and no other tools, the software allows participants and management to do the following:
- See the list of tests to perform. The list is derived from the registration provided by each participant and data entered previously in the Gazelle Master Model. Each system owner sees a list of tests tailored to the system registration.
- Select a test partner or partners. A participant can select an individual test and then work through a menu to select appropriate test partners. If registration is done properly, each participant knows the roles played by the other participants and does not have to spend time canvassing partners.
- Execute a test, including attaching evidence (screen captures, log files, ...) to the test. Tests that are well written include requests for specific data. At a smaller event you might decide not to upload the evidence for each test but to have the monitor review the tests in real time. That is certainly feasible and can provide a closer inspection of the process: you are more likely to see the user interface of a participant system and to provide some useful feedback.
- Manage the test lifecycle (in progress, to be verified, verified, paused, ...).
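The lifecycle states named in the text form a small state machine. A minimal sketch, assuming transitions that match the workflow described on this page (a monitor can send a "to be verified" test back to "in progress"); the exact transition rules in Gazelle Test Management may differ.

```python
# States taken from the text: in progress, to be verified, verified, paused.
# The allowed-transition table is an assumption for this sketch.
ALLOWED = {
    "in_progress": {"to_be_verified", "paused"},
    "paused": {"in_progress"},
    "to_be_verified": {"verified", "in_progress"},  # monitor may send it back
    "verified": set(),  # terminal
}

class TestInstance:
    def __init__(self):
        self.state = "in_progress"
        self.evidence = []  # screen captures, log files, ...

    def attach(self, item):
        self.evidence.append(item)

    def move_to(self, new_state):
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"cannot move from {self.state} to {new_state}")
        self.state = new_state

t = TestInstance()
t.attach("screenshot.png")
t.move_to("to_be_verified")
print(t.state)
```

The useful property of modeling it this way is that "verified" is terminal: once a monitor verifies a test, participants cannot quietly reopen and change it.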
Assigning Test Monitors
Assigning monitors and managing their worklists is essential for larger events, but it is also useful for smaller events where you have decided to use Gazelle Test Management to record the results of individual tests. Gazelle Test Management can record the following information:
- Name of each monitor
- Photo of each monitor (optional)
- Table where monitor is sitting (optional)
- List of tests the monitor will verify
If you choose to enter this information, you can tailor the monitor duties based on their expertise. You can assign one or more monitors to a specific set of profiles to take advantage of their background. You would also assign other monitors to different profiles. This information is used as follows:
- After a participant places a test in the To Be Verified state, that test will appear in the Gazelle Monitor Worklist.
- Monitors review the worklist on an ongoing basis. They can see a new test that is in their area, claim the test, and then work with the participants to verify the test.
- Participants can use the optional information (photo, table location) to locate the monitor.
If you are using Gazelle Test Management for a smaller event, the Monitor Worklist is still useful. Even if you do not assign specific tests to individual monitors, the worklist can still show all of the tests that are ready for validation with timestamps. This is useful in itself to make sure that participant work is not inadvertently missed. There are enough filtering items on the worklist page that a monitor can look for specific items (again assuming that individual assignments were not made).
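The worklist behavior described above (tests appear when placed in To Be Verified, oldest first, filterable by profile, claimable by a monitor) can be sketched like this. The data shapes and names are illustrative, not Gazelle's.

```python
from datetime import datetime

# Each entry: (test_id, profile, submitted_at, claimed_by); names illustrative.
worklist = [
    ("T-101", "XDS.b", datetime(2020, 2, 4, 9, 30), None),
    ("T-102", "PIX",   datetime(2020, 2, 4, 9, 45), None),
    ("T-103", "XDS.b", datetime(2020, 2, 4, 10, 0), "monitor-1"),
]

def unclaimed(worklist, profiles=None):
    """Unclaimed tests ready for validation, oldest first,
    optionally filtered to the profiles a monitor covers."""
    items = [w for w in worklist
             if w[3] is None and (profiles is None or w[1] in profiles)]
    return sorted(items, key=lambda w: w[2])

def claim(worklist, test_id, monitor):
    """A monitor claims a test from the worklist."""
    return [(tid, p, ts, monitor if tid == test_id else who)
            for tid, p, ts, who in worklist]

for tid, profile, ts, _ in unclaimed(worklist, profiles={"XDS.b"}):
    print(tid, profile)
```

With no profile filter, the same query gives the event-wide "oldest waiting test" view mentioned above, which is how you make sure participant work is not inadvertently missed.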
Perform Overall Grading
The published outcome of a Connectathon is a list of profile/actor pairs that have been successfully tested by each participating organization. That is, for each organization, IHE International publishes that the organization successfully tested actor Alpha in profile Beta at the Connectathon Gamma.
- IHE International does not publish test results for options. These are tested but not reported due to some software limitations.
- IHE International does not report that an organization attempted to test a profile/actor pair but did not complete testing. Such conditions may be beyond the control of the participant and publishing such a result might be misleading.
- Results are recorded at the organizational level and not at the system level.
- An organization that brings two different systems with similar capabilities will only see a combined result. This is common in Radiology, where a company might bring two or three Acquisition Modality systems.
- There is no identifying system information. There is no indication of the actual product and/or software version.
The Connectathon Manager and/or responsible delegates use screens in the Gazelle Test Management system to review the work completed by each organization and to assign an overall assessment for each profile/actor pair (Pass, Withdrawn, Did Not Complete, ...). Systems are often marked as Pass during the event, once it is determined they have completed the required work. A participant can see this status and know they can focus their attention on other work.
The Connectathon Manager and/or delegates complete the overall grading at the end of the event in Gazelle Test Management. These results are private to participants. They can review the results and present evidence while requesting adjustments. After this work is finalized, the Gazelle Test Management software can export a report that can be given to the Kereval team for inclusion in the published Connectathon Result Matrix. After that final step, the results are now public. Individual Gazelle Test Management systems do not provide a page that shows a public report; one has to go to the IHE Connectathon Results system.
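The publication rules in this section reduce to a filter: of all graded organization/profile/actor rows, only those marked Pass reach the public result matrix; Withdrawn and Did Not Complete rows are simply omitted. A minimal sketch with illustrative data:

```python
# Grades keyed by (organization, profile, actor); data is illustrative.
grades = {
    ("OrgA", "XDS.b", "Document Source"): "Pass",
    ("OrgA", "XDS.b", "Document Registry"): "Did Not Complete",
    ("OrgB", "PIX", "Patient Identity Source"): "Pass",
    ("OrgB", "XDS.b", "Document Source"): "Withdrawn",
}

def publishable(grades):
    """Only passing org/profile/actor rows make the public result matrix;
    incomplete and withdrawn attempts are omitted, per the rules above."""
    return sorted(key for key, grade in grades.items() if grade == "Pass")

for org, profile, actor in publishable(grades):
    print(f"{org} successfully tested {actor} in {profile}")
```

Note that the key is the organization, not the system, which is why two similar systems from one company collapse into a single published row.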