Revealing architectural order with quantitative label-free imaging and deep learning. Guo, S. M., Yeh, L. H., Folkesson, J., Ivanov, I. E., Krishnan, A. P., Keefe, M. G., Hashemi, E., Shin, D., Chhun, B. B., Cho, N. H., Leonetti, M. D., Han, M. H., Nowakowski, T., Mehta, S. B. eLife. 2020; 9

Abstract

We report quantitative label-free imaging with phase and polarization (QLIPP) for simultaneous measurement of density, anisotropy, and orientation in unlabeled live cells and tissue slices. We combine QLIPP with deep neural networks to predict fluorescence images of diverse cell and tissue structures. QLIPP images reveal anatomical regions and axon tract orientation in prenatal human brain tissue sections that are not visible with brightfield imaging. We report a variant of the U-Net architecture, the multi-channel 2.5D U-Net, for computationally efficient prediction of fluorescence images in three dimensions and over large fields of view. Further, we develop data normalization methods for accurate prediction of myelin distribution over large brain regions. We show that experimental defects in labeling the human tissue can be rescued with quantitative label-free imaging and a neural network model. We anticipate that the proposed method will enable new studies of architectural order at spatial scales ranging from organelles to tissue.
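
The sketch below illustrates the general "2.5D" idea referenced in the abstract: a thin stack of z-slices from several label-free channels (e.g., phase, retardance, orientation) is used to predict a single 2D fluorescence slice. This is a minimal, hypothetical PyTorch example, not the authors' released implementation; in particular, it simply folds the z-slices into the channel axis of a 2D U-Net, and all channel counts, slice counts, and layer widths are illustrative assumptions.

```python
# Hypothetical sketch of a "2.5D" U-Net: several label-free channels x a few
# z-slices in, one predicted 2D fluorescence slice out. Not the paper's code.
import torch
import torch.nn as nn

class DoubleConv(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class UNet25D(nn.Module):
    def __init__(self, n_channels=3, n_slices=5, base=32):
        super().__init__()
        in_ch = n_channels * n_slices            # z-slices folded into 2D channels (simplifying assumption)
        self.enc1 = DoubleConv(in_ch, base)
        self.enc2 = DoubleConv(base, base * 2)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = DoubleConv(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = DoubleConv(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = DoubleConv(base * 2, base)
        self.head = nn.Conv2d(base, 1, 1)        # single predicted fluorescence slice

    def forward(self, x):                        # x: (N, C, Z, H, W)
        n, c, z, h, w = x.shape
        x = x.reshape(n, c * z, h, w)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# Example: 3 label-free channels x 5 z-slices of a 256x256 field -> one 2D slice.
model = UNet25D()
pred = model(torch.randn(1, 3, 5, 256, 256))     # -> shape (1, 1, 256, 256)
```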

DOI: 10.7554/eLife.55502

PubMed ID: 32716843