Talk:APW-EDM White Paper


Discussion Page for APW-EDM White Paper

NOTE FROM NCJ: Per Gunter's suggestion, I made the discussion page, and I'm moving a lot of my comments/questions that I had written in text over to here. Hopefully that will make the main page for drafting the white paper less messy and more readable. -Nicholas C. Jones (talk) 15:24, 13 March 2018 (CDT)


Discussion: Use Case #1: Image Slides for Secondary Review / Consultation

Regarding 1c: NOTE FROM NCJ: Dr. Dash: can you verify whether we should separate this out from 1b, or whether we should put it as a sub-category under 1b, since the physician here is going to get patient consent for the consultation anyway? In my experience, though, these differ notably from 1b both in clinical process and in likelihood of impact / average value. -Nicholas C. Jones (talk) 15:24, 13 March 2018 (CDT)

Regarding time frames for returning glass slides: (NOTE FROM NCJ: In the US, the norm is 30 days, though I believe this is a norm rather than a law. Are there differences in practice in Europe and other areas of the world?) -Nicholas C. Jones (talk) 15:24, 13 March 2018 (CDT)

Regarding 1e: NOTE FROM NCJ: I am not familiar enough with these processes to go into depth here. It's even worth questioning that we want this in the profile, as it's only semi-clinical. But I think there's at least benefit in mentioning that this is something to consider for system and workflow design so the vendors are at least aware of this as a different use case. What do the rest of the white paper team think? -Nicholas C. Jones (talk) 15:24, 13 March 2018 (CDT)

NOTE FROM NCJ: Hopefully, between the contexts listed above, the consultation-contexts PowerPoint and discussion in the teleconference, and future discussion, we can all get on the same page about why these contexts have significant differences in workflow. The big question to me is: do we want to make one big decision-tree profile for all consultations, or different trees for different contexts, or do the other authors have other suggestions? I must note that this is a big area where there is a gap in functionality in many vendor products, because these nuances have not previously been explained in white paper form. So while we might recoil from the complexity here, I think this is a key area of value for the white paper, but I am open to constructive criticism here. -Nicholas C. Jones (talk) 15:12, 13 March 2018 (CDT)

Discussion: Use Case #2: Immunohistochemistry Positive Control Slides

Discussion: Use Case #3: Managing Digital Assets for Anatomic Pathology Clinical Workflows

Discussion: Use Case #4: Sharing and Cooperating on Gross Examination Images

Discussion: Use Case #5: Incorporation of Legacy Digital Images for Use in APW

Discussion: Use Case #6: Image Analysis, Machine Learning and In Silico Workflows

Discussion: Use Case #7: Quality Control / Quality Assurance and Error Correction Workflows to Support Digital Pathology

NCJ Note: I'm not sure if we want to consider this its own use case explicitly; these are workflows inherent to all scanning processes (details will vary by the use case the QC/QA process is covering). But these are part of workflow and should be modeled. The other option is to consider this section as modeling for scanning processes in general...

NCJ Note: Quality control and quality assurance are somewhat polysemous terms. See my PowerPoint in the shared Dropbox area for some details. For clarity here, I will define three terms for use within the document. We can debate these if desired.

Definitions: Quality Evaluation (QE): An assessment from a device, software algorithm, or person about the quality of a target physical or digital asset.

NCJ Note: This is my definition, made to distinguish the common vernacular use of QC - as in "he QC'd the image" - from the formal definition of quality control. I'm open to using a different phrase if desired. An important distinction here is that the scanners and scanning software have internal QE tools, which we would want to discriminate from technician QE, pathologist QE, and potential external algorithmic/analytic QEs. Note that the scanner/software/tech QEs have been part of both FDA clinical validations and technical validations. (See FDA technical guidance.)

Quality Control (QC): A system for verifying and maintaining a desired level of quality in an individual test or process. (From the traditional CAP definition. My note: this usually implies prospective processes, and generating data.)

Quality Assurance (QA): Systemic monitoring of quality control results and quality practice parameters to assure that all systems are functioning in a manner appropriate to excellence in health care delivery. (From traditional CAP definition.)

Specimen (Definition from DICOM Supplement 122): A physical object (or a collection of objects) is a specimen when the laboratory considers it a single discrete, uniquely identified unit that is the subject of one or more steps in the laboratory (diagnostic) workflow. A specimen can have parent specimens (or the patient as the ancestor/source) and can have child specimens (e.g. a TAH-BSO has the patient as its source/parent, and child specimens of the uterus, two fallopian tubes, and two ovaries, all of which can have multiple blocks, with multiple slides, each of which can be scanned). A specimen could even be material removed from a slide via laser microdissection. The major thing to understand here is the extensibility of the parent-to-child relationships.

Container (also see DICOM Supplement 122): A physical object containing a specimen. This could be the jar filled with formalin holding an excised organ upon delivery to the surgical pathology laboratory, the cassette holding a block (consisting of paraffin wax and tissue cut from the organ/biopsy), or the slide itself encasing the stained tissue. A container usually carries specimens from only one source, but in some cases it could hold specimens from multiple sources (such as a TMA).
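To make the parent-to-child extensibility concrete, here is a minimal sketch (Python; the class, field, and method names are my own hypothetical choices, not taken from DICOM Supplement 122 or any IHE profile) of how such a specimen hierarchy could be modeled:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Specimen:
    """A single discrete, uniquely identified unit in the lab workflow."""
    specimen_id: str
    description: str
    parent: Optional["Specimen"] = None   # None => the patient is the source
    children: List["Specimen"] = field(default_factory=list)

    def derive(self, specimen_id: str, description: str) -> "Specimen":
        """Create a child specimen (e.g. a block cut from an organ)."""
        child = Specimen(specimen_id, description, parent=self)
        self.children.append(child)
        return child

    def lineage(self) -> List[str]:
        """IDs from the root (patient-sourced) specimen down to this one."""
        node, chain = self, []
        while node is not None:
            chain.append(node.specimen_id)
            node = node.parent
        return list(reversed(chain))

# TAH-BSO example from the definition above: a patient-sourced part,
# with a block as its child and a scanned slide as the block's child.
uterus = Specimen("S18-100-A", "uterus")
block = uterus.derive("S18-100-A1", "block")
slide = block.derive("S18-100-A1-1", "H&E slide")
```

The same derive() call extends the tree arbitrarily deep, which is the point of the definition: even laser-microdissected material from a slide is just one more child specimen.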

-Nicholas C. Jones (talk) 17:53, 13 March 2018 (CDT)

Discussion: Use Case #8: Digital Pathology in Support of Clinical Conferences

Discussion: Use Case #9: Sub-contracting for special analyses on specimens

Discussion: Use Case #10: Image Registration Functions

Regarding 10.c: NOTE FROM NCJ: Does anyone know of any groups using image registration from WSIs/blocks to gross images in a clinical environment? If so, we should separate that out from the "future use" section. -Nicholas C. Jones (talk) 15:24, 13 March 2018 (CDT)

Discussion: Use Case #11: Digital Pathology in Support of Intraoperative Procedures

I think we could batch frozen section interpretations and rapid FNA adequacy assessments here together, but if anyone wants to split them into different use cases, that would be fine as well. -Nicholas C. Jones (talk) 15:24, 13 March 2018 (CDT)

I also think we should note that frozen section and rapid FNA slides often require additional steps for identification. Some sites may only hand-write the labels here, or may later print labels to affix to these slides. At the time of the intraoperative interpretation of the images, such slides may not have optimal, formal, automated identification set up, so systems should allow for identification exception handling here and allow for reconciliation later in the process (i.e. after permanents come out). Lack of formal workflow or system design here can create opportunities for error, and good process design will allow for quality metrics that can be reviewed as part of QC/QA processes.
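As a rough illustration of the exception-handling and later-reconciliation pattern described above (a sketch only; all class and method names are hypothetical and no vendor API is implied):

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class SlideRecord:
    provisional_label: str               # hand-written label at frozen-section time
    accession_id: Optional[str] = None   # formal ID, filled in at reconciliation

class IdReconciliationLog:
    """Tracks slides imaged before formal identification was available."""

    def __init__(self) -> None:
        self._pending: Dict[str, SlideRecord] = {}
        self.reconciled: List[SlideRecord] = []

    def register_exception(self, provisional_label: str) -> SlideRecord:
        """Record a slide whose formal ID is not yet known."""
        rec = SlideRecord(provisional_label)
        self._pending[provisional_label] = rec
        return rec

    def reconcile(self, provisional_label: str, accession_id: str) -> SlideRecord:
        """Attach the formal ID once permanents come out."""
        rec = self._pending.pop(provisional_label)
        rec.accession_id = accession_id
        self.reconciled.append(rec)
        return rec

    def outstanding(self) -> List[str]:
        """Unreconciled labels -- a candidate QC/QA quality metric."""
        return sorted(self._pending)
```

The outstanding() count is exactly the kind of quality metric the note suggests reviewing as part of QC/QA processes.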