Paper 11960-38

Improving the histological realism of ultraviolet photoacoustic remote sensing microscopy images using a deep learning-based generative adversarial network

Abstract

Following resection of cancerous tissue, specimens are excised from the surgical margins and examined post-operatively for residual cancer cells. Hematoxylin and eosin (H&E) staining is the gold standard for this histopathological assessment. Ultraviolet photoacoustic remote sensing (UV-PARS) microscopy, combined with scattering microscopy, provides virtual nuclear and cytoplasmic contrast similar to H&E staining. A generative adversarial network (GAN) deep learning approach, specifically a CycleGAN, was used to perform style transfer that improves the histological realism of UV-PARS-generated images. The post-CycleGAN images are easier for a pathologist to examine and can be fed into existing machine learning pipelines built for H&E-stained images.
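
For readers unfamiliar with the method named in the abstract, the sketch below illustrates the cycle-consistency idea at the core of a CycleGAN, which is what allows unpaired UV-PARS and H&E image sets to be used for style transfer. This is a minimal PyTorch illustration with toy generators, placeholder data, and an assumed loss weight; it is not the authors' implementation, and the full CycleGAN objective also includes adversarial terms from two discriminators (one per image domain).

# Minimal sketch of CycleGAN cycle consistency for unpaired
# UV-PARS -> H&E style transfer. Generator architecture, loss weight,
# and data are placeholders, not the authors' implementation.
import torch
import torch.nn as nn

class SmallGenerator(nn.Module):
    """Toy image-to-image generator (stand-in for a ResNet/U-Net generator)."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

G = SmallGenerator()   # maps UV-PARS images toward the H&E domain
F = SmallGenerator()   # maps H&E images back toward the UV-PARS domain

l1 = nn.L1Loss()
lambda_cyc = 10.0      # assumed weight, as in the original CycleGAN paper

uvpars_batch = torch.rand(4, 3, 64, 64)  # placeholder unpaired batches
he_batch = torch.rand(4, 3, 64, 64)

# Cycle consistency: translating to the other domain and back should
# reproduce the input, which preserves tissue structure even though
# no pixel-aligned UV-PARS/H&E image pairs exist.
cycle_loss = lambda_cyc * (
    l1(F(G(uvpars_batch)), uvpars_batch) +
    l1(G(F(he_batch)), he_batch)
)
# Adversarial losses from the two discriminators are omitted for brevity.
print(cycle_loss.item())

In practice the cycle term is what keeps the translated image anatomically faithful to the UV-PARS input, while the adversarial terms push its color and texture toward the statistics of real H&E slides.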

Presenter

Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)
Presenter/Author
Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)
Author
Univ. of Alberta (Canada)