AIRAI Checkpoint Assessment: Difference between revisions

= AIRAI Checkpoint Assessment =
* Describe gaps in Use Case coverage
** Expectations among participants as to which use cases have to be addressed differed hugely.
* Review ALL "uncertainty points" in the evaluation. Is there a resolution plan for each?
** Discussion this week centered on the general concept and is not resolved; none of the other topics have been looked at yet.
* Do the effort points in the evaluation still seem right?
** Since the concept has not been decided on, no statement is possible.
* Did the Breakdown of Tasks accurately reflect the work? What extra tasks arose?
** Since the concept has not been decided on, no statement is possible.
* Describe unresolved technical issues/tasks
** Not yet decided whether to use KOS objects or an AIR result tree object for flagging the status of AI results.
* Describe potential practical issues
** Since the concept has not been decided on, no statement is possible.
* Review the open issue list. Does it feel complete?
** There is no open issues list yet.
* Which open issues feel most risky; what other risks exist?
** No open issues yet.
* How is the work fitting in the allocated bandwidth? (Time to spare? Just right? Things were left undone?)
** Because the underlying concept has not yet been agreed on, no work on the profile has been done.
* How does the scope feel? (Room to expand? Just right? Pretty ambitious?)
** Due to ongoing discussions on the scope and a potential change in the underlying technology, no statement is possible yet.
* If you had to reduce scope, what would you drop?
* Have the promised resources manifested?
** Yes
* What tasks would benefit from additional expertise?
** TBD
* What vendors are engaged for each actor? Record how many.
* Was the profile where it needed to be at the start of the Kickoff meeting (see "Kickoff Meeting" above)? If not, what was the gap?
** Yes, it was; however, since the underlying concept was rejected, the profile document has not been looked at at all.
* Was the profile where it needed to be at the end of the Kickoff meeting? If not, what was the gap?
** No, since we could not agree on the underlying concept.
* How many tcons would you like between now and the PC Prep Meeting?
** At least three.

Revision as of 09:10, 15 November 2024
