AIRAI Checkpoint Assessment
Kick Off Meeting
- Describe gaps in Use Case coverage
- Expectations among participants as to which use cases the profile must address differed widely.
- All use cases have been confirmed to be in scope.
- The original Basic Statistics use case was significantly enlarged in scope and increased in importance. This created additional uncertainty as to where the effort for this profile should be focused.
- The importance and details of each use case were discussed at length; the relative priority of the use cases, and of the scenarios within each use case, was debated and remains uncertain.
- The implementation of the profile depends largely on resolving these discussions, since the outcome will affect the data and underlying mechanisms of the profile.
- Review ALL "uncertainty points" in the evaluation. Is there a resolution plan for each?
- Uncertainty points were not assigned to the Creation and Consumption use cases, which turned out to be more controversial than expected. The plan is to prepare more detailed definitions of these use cases and hold a first follow-up t-con to discuss and formalize the descriptions.
- Discussion this week centered on the Assessment Record Method and is not resolved; the uncertainty on this subject has not decreased and has probably even increased. The plan is to hold a second follow-up t-con to compare the prepared detailed record methods against the extended use case definitions. Authors will create shared documents on Google Drive to which other participants will contribute.
- Undiscussed uncertainties include edge use cases, packaging, and grouping. A plan will be established based on the results of the two t-cons.
- Do the effort points in the evaluation still seem right?
- Since the concept has not been decided, this is hard to judge.
- Did the Breakdown of Tasks accurately reflect the work? What extra tasks arose?
- The tasks are roughly correct, but we underestimated the uncertainty and the amount of preparation required.
- In retrospect, better recognition of the uncertainty would have suggested an advance committee t-con to map out a resolution strategy that the authors could bring to the kick-off meeting.
- Describe unresolved technical issues/tasks
- Not decided yet, see uncertainty section.
- Describe potential practical issues
- Since the concept is not decided, this is not entirely clear.
- Issues with large volumes of objects were raised as potential impacts on the practicality of implementation.
- Issues with complexity of objects
- Issues with coverage of use cases
- Review the open issue list. Does it feel complete?
- There is no open issue list yet.
- Which open issues feel most risky; what other risks exist?
- The large degree of uncertainty is the biggest risk.
- How is the work fitting in the allocated bandwidth? (Time to spare? Just right? Things were left undone?)
- Things were left undone. Uncertainty resolution was not completed, which prevented work on subsequent tasks.
- How does the scope feel? (Room to expand? Just right? Pretty ambitious?)
- The scope feels more ambitious than expected.
- There were more debates than expected on the priority of various parts of the scope.
- If you had to reduce scope, what would you drop?
- Unless uncertainty resolution is achieved, one fallback strategy is to produce a white paper in order to achieve resolution this cycle.
- If the use cases and mechanisms partition in a well-separated fashion, it may be possible to do one use case and its mechanism.
- Have the promised resources manifested?
- Yes; only really promised opinions.
- What tasks would benefit from additional expertise?
- Not at this point
- What vendors are engaged for each actor? Record how many.
- Most engagement is indirect, through the IHE-Europe Task Force AIGI. Siemens Healthineers (depending on the use cases adopted), Merge, VISUS, Visage (depending on the use cases adopted).
- Was the profile where it needed to be at the start of the Kickoff meeting (see "Kickoff Meeting" above)? If not, what was the gap?
- Yes, given the minimal guidance for the early steps.
- One choice for the underlying concepts was well documented; however, alternatives were not available in advance.
- Thus resolution of uncertainties was limited.
- In retrospect, items with large uncertainty points (>1) should trigger discovery and documentation of the main alternatives and of methods for comparing/assessing them.
- Was the profile where it needed to be at the end of the Kickoff meeting? If not, what was the gap?
- No; major uncertainties are unresolved
- How many t-cons would you like between now and the PC Prep Meeting?
- At least three
Public Comment Preparation Meeting
- Profile Name: AI Result Assessment for Imaging (AIRAI)
- Did we line-by-line the entire document?
- No; two more two-hour t-cons are needed.
- How ready is it to go out for PC: Completely, Almost, Soonish, Hmmm
- Editors will incorporate feedback from the face-to-face meeting.
- Individual member review, and t-cons for line-by-line review, are needed.
- Which open issues are risky, and why?
- Open Issue 5 may lead to some complex but necessary discussions.
- Are all open issues phrased to solicit the needed information to close them?
- Yes (though some open issues may still be missing).
- Which use cases need more input?
- We want general feedback on all use cases from a variety of facilities, to explore the variation in their patterns, hoping they all boil down to the same core.
- Which issues from the Kickoff Closing Assessment are still unresolved?
- Encoding issues are largely resolved by the use of an assessment object. Large volumes of objects are handled because we are not using individual KOS objects.
- Complexity issues mostly resolved by keeping the assessment object fairly flat, perhaps with nesting for multiple assessors/assessments.
- What significant debates in PC-prep were not anticipated in the Kickoff?
- Review ALL "uncertainty points" in the evaluation. Are all now resolved?
- Review ALL "complexity points" in the evaluation. Did each get appropriate text coverage/resolution?
- Review the "effort points" in the evaluation. Still seems right? Need more?
- How does the scope feel in terms of being a useful chunk of work? (Needs more? Just right? More than enough?)
- How is the work fitting in the allocated bandwidth? (Time to spare? Just right? Things were left undone?)
- Did the Breakdown of Tasks accurately reflect the work? What extra tasks arose?
- Looking forward, if you had to reduce scope to hit TI, what would you drop?
- Have the promised resources manifested?
- What vendors are engaged (for each actor)?
- When will we have sample data/objects?
- Who should specifically be targeted for Public Comment feedback?
- Was the profile where it needed to be at the start of the PC meeting (see "PC Prep Meeting" above)? If not, what was the gap?
- Was the profile where it needed to be at the end of the PC meeting? If not, what was the gap?
- How many t-cons would you like between now and PC Publication?
- Do you need any t-cons before the TI Prep Meeting?