Deep Generative Adversarial Neural Networks for Compressive Sensing MRI. IEEE TRANSACTIONS ON MEDICAL IMAGING. Mardani, M., Gong, E., Cheng, J. Y., Vasanawala, S. S., Zaharchuk, G., Xing, L., Pauly, J. M. 2019; 38 (1): 167–79

Abstract

Undersampled magnetic resonance image (MRI) reconstruction is typically an ill-posed linear inverse task. The time- and resource-intensive computations require trade-offs between accuracy and speed. In addition, state-of-the-art compressed sensing (CS) analytics are not cognizant of the image diagnostic quality. To address these challenges, we propose a novel CS framework, termed GANCS, that uses generative adversarial networks (GAN) to model the (low-dimensional) manifold of high-quality MR images. Leveraging a mixture of least-squares (LS) GAN and pixel-wise l1/l2 costs, a deep residual network with skip connections is trained as the generator, which learns to remove aliasing artifacts by projecting onto the image manifold. The LSGAN learns the texture details, while the l1/l2 cost suppresses high-frequency noise. A discriminator network, a multilayer convolutional neural network (CNN), plays the role of a perceptual cost: it is jointly trained on high-quality MR images and scores the quality of retrieved images. In the operational phase, an initial aliased estimate (e.g., obtained simply by zero-filling) is propagated through the trained generator to output the desired reconstruction, which demands very low computational overhead. Extensive evaluations are performed on a large contrast-enhanced MR dataset of pediatric patients. Ratings by expert radiologists corroborate that GANCS retrieves higher-quality images with improved fine texture details compared with conventional wavelet-based and dictionary-learning-based CS schemes, as well as with deep-learning schemes trained with pixel-wise costs alone. In addition, GANCS offers reconstruction times of under a few milliseconds, two orders of magnitude faster than current state-of-the-art CS-MRI schemes.
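To make the training objective described above concrete, the minimal sketch below illustrates the composite generator cost: an LSGAN adversarial term mixed with pixel-wise l1/l2 fidelity to the fully sampled reference image. The function name, mixing weights, and toy inputs are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def gancs_generator_loss(g_out, target, d_score,
                         w_adv=0.01, w_l1=1.0, w_l2=1.0):
    """Illustrative composite generator cost (assumed weights, not the paper's).

    g_out   : generator output image (2-D array)
    target  : gold-standard (fully sampled) reference image, same shape
    d_score : scalar discriminator score for g_out
    w_*     : hypothetical mixing weights for the adversarial and pixel terms
    """
    adv = (d_score - 1.0) ** 2               # LSGAN term: push D(G(x)) toward 1
    l1 = np.mean(np.abs(g_out - target))     # pixel-wise l1, suppresses high-frequency noise
    l2 = np.mean((g_out - target) ** 2)      # pixel-wise l2 fidelity
    return w_adv * adv + w_l1 * l1 + w_l2 * l2

# Toy usage with random arrays standing in for a reconstruction and its reference
rng = np.random.default_rng(0)
recon = rng.random((128, 128))
ref = rng.random((128, 128))
print(gancs_generator_loss(recon, ref, d_score=0.7))
```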

DOI: 10.1109/TMI.2018.2858752

Web of Science ID: 000455110500017

PubMed ID: 30040634