Histological staining of tissue biopsies, especially hematoxylin and
eosin (H&E) staining, serves as the benchmark for disease
diagnosis and comprehensive clinical assessment of tissue. However,
the conventional formalin-fixation, paraffin-embedding (FFPE) process is
laborious and time-consuming, often limiting its use in
time-sensitive applications such as surgical margin assessment. To
address these challenges, we combine an emerging 3D quantitative phase
imaging technology, termed quantitative oblique back illumination
microscopy (qOBM), with an unsupervised generative adversarial network
pipeline to map qOBM phase images of unaltered thick tissues (i.e.,
label- and slide-free) to virtually stained H&E-like (vH&E)
images.
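The abstract does not specify the network architecture; as one plausible reading, the sketch below shows a CycleGAN-style unpaired training step in PyTorch that maps grayscale qOBM phase patches to RGB vH&E patches using an adversarial loss plus cycle consistency. All module names, layer sizes, and hyperparameters (e.g., G_qobm2he, lambda_cyc) are illustrative assumptions, not the authors' implementation.

```python
# Minimal CycleGAN-style training step for unpaired qOBM -> vH&E translation.
# All names and hyperparameters here are illustrative, not from the paper.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, padding=1),
                         nn.InstanceNorm2d(out_ch), nn.ReLU(inplace=True))

class TinyGenerator(nn.Module):
    """Stand-in generator; a real pipeline would use a deeper ResNet/U-Net."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(conv_block(in_ch, 32), conv_block(32, 32),
                                 nn.Conv2d(32, out_ch, 3, padding=1), nn.Tanh())
    def forward(self, x):
        return self.net(x)

class TinyDiscriminator(nn.Module):
    """PatchGAN-style critic that scores local realism of an image."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(in_ch, 32, 4, stride=2, padding=1),
                                 nn.LeakyReLU(0.2, inplace=True),
                                 nn.Conv2d(32, 1, 4, stride=2, padding=1))
    def forward(self, x):
        return self.net(x)

# Generators map between the two unpaired domains; the discriminator judges vH&E realism.
G_qobm2he = TinyGenerator(in_ch=1, out_ch=3)   # grayscale phase -> RGB vH&E
G_he2qobm = TinyGenerator(in_ch=3, out_ch=1)   # inverse mapping for cycle consistency
D_he = TinyDiscriminator(in_ch=3)
adv_loss, cyc_loss = nn.MSELoss(), nn.L1Loss()
opt_g = torch.optim.Adam(list(G_qobm2he.parameters()) + list(G_he2qobm.parameters()), lr=2e-4)

def generator_step(qobm_batch, lambda_cyc=10.0):
    """One unpaired training step: adversarial realism + cycle-consistency penalty."""
    fake_he = G_qobm2he(qobm_batch)              # qOBM phase -> virtual H&E
    rec_qobm = G_he2qobm(fake_he)                # map back to the phase domain
    pred = D_he(fake_he)
    loss = adv_loss(pred, torch.ones_like(pred)) + lambda_cyc * cyc_loss(rec_qobm, qobm_batch)
    opt_g.zero_grad(); loss.backward(); opt_g.step()
    # A full pipeline would also train D_he and a second discriminator on the phase domain.
    return loss.item()

print(generator_step(torch.rand(2, 1, 64, 64) * 2 - 1))
```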
We demonstrate that the approach achieves high-fidelity conversions to H&E with subcellular detail using fresh tissue
specimens from mouse liver, rat gliosarcoma, and human gliomas. We
also show that the framework directly enables additional capabilities
such as H&E-like contrast for volumetric imaging.
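Because qOBM yields tomographic phase sections, one way such volumetric H&E-like contrast could be produced is by applying a trained generator slice by slice through a z-stack. The sketch below (reusing the hypothetical G_qobm2he generator from the sketch above) illustrates this idea; it is an assumption, not the paper's actual volumetric procedure.

```python
# Slice-wise virtual staining of a qOBM z-stack (illustrative only).
import torch

@torch.no_grad()
def stain_volume(generator, qobm_stack):
    """qobm_stack: (Z, H, W) phase sections -> (Z, 3, H, W) vH&E volume."""
    generator.eval()
    slices = []
    for z in range(qobm_stack.shape[0]):
        phase = qobm_stack[z][None, None]        # add batch and channel dims
        slices.append(generator(phase)[0])       # (3, H, W) virtual H&E slice
    return torch.stack(slices)                   # stack sections into a volume

vhe_volume = stain_volume(G_qobm2he, torch.rand(16, 64, 64) * 2 - 1)
print(vhe_volume.shape)  # torch.Size([16, 3, 64, 64])
```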
The quality and fidelity of the vH&E images are validated with a neural
network classifier trained on real H&E images and tested on virtual H&E
images, as well as a user study with neuropathologists.
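One hedged reading of the classifier-based validation: a classifier fit only on real H&E patches is evaluated, unchanged, on vH&E patches, so that transferred accuracy indicates the virtual stain preserves diagnostically relevant features. The toy model, task, and tensors below are placeholders, not the study's actual classifier or data.

```python
# Hedged sketch of the classifier-based validation: train on real H&E patches,
# then evaluate on virtual H&E patches. Model, task, and data are illustrative.
import torch
import torch.nn as nn

classifier = nn.Sequential(                      # tiny stand-in CNN classifier
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)

def train_on_real_he(real_patches, labels, epochs=5):
    """Fit the classifier using real H&E patches only."""
    classifier.train()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = criterion(classifier(real_patches), labels)
        loss.backward(); optimizer.step()

@torch.no_grad()
def accuracy_on(patches, labels):
    """Report accuracy; applied to vH&E patches it probes stain fidelity."""
    classifier.eval()
    preds = classifier(patches).argmax(dim=1)
    return (preds == labels).float().mean().item()

# Toy tensors stand in for real-H&E training data and vH&E test data.
real_x, real_y = torch.rand(8, 3, 64, 64), torch.randint(0, 2, (8,))
vhe_x, vhe_y = torch.rand(8, 3, 64, 64), torch.randint(0, 2, (8,))
train_on_real_he(real_x, real_y)
print("accuracy on virtual H&E:", accuracy_on(vhe_x, vhe_y))
```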
Given its simple, low-cost embodiment and its ability to provide real-time
feedback in vivo, this deep-learning-enabled qOBM approach could enable new
workflows for histopathology, with the potential to significantly reduce the
time, labor, and cost of cancer screening, detection, treatment guidance,
and more.