MCMT-GAN: Multi-Task Coherent Modality Transferable GAN for 3D Brain Image Synthesis
Document Type
Article
Publication Title
IEEE Transactions on Image Processing
Abstract
The ability to synthesize multi-modality data is highly desirable for many computer-aided medical applications, e.g., clinical diagnosis and neuroscience research, since rich imaging cohorts offer diverse and complementary information for characterizing human tissues. However, acquiring multiple modalities can be limited by adverse factors such as patient discomfort, high cost, and scanner unavailability. In this paper, we propose a multi-task coherent modality transferable GAN (MCMT-GAN) to address this issue for brain MRI synthesis in an unsupervised manner. By combining a bidirectional adversarial loss, cycle-consistency loss, domain-adapted loss, and manifold regularization in a volumetric space, MCMT-GAN synthesizes multi-modality brain images robustly and with high visual fidelity. In addition, we pair the discriminators with segmentors that work collaboratively, ensuring that our results remain useful for segmentation tasks. Experiments on various cross-modality synthesis tasks show that our method produces visually impressive results suitable for clinical post-processing and exceeds state-of-the-art methods.
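The abstract's cycle-consistency term enforces that mapping a volume to the other modality and back recovers the original. A minimal illustrative sketch of that term is shown below on toy 1-D data; the generator functions `g_ab` and `g_ba` are placeholder assumptions, not the paper's actual networks, and the full MCMT-GAN objective additionally includes the bidirectional adversarial, domain-adapted, and manifold-regularization terms described in the paper.

```python
def l1_loss(a, b):
    """Mean absolute error between two equally sized sequences."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def g_ab(x):
    """Placeholder generator, modality A -> B (illustrative, not the paper's model)."""
    return [2.0 * v for v in x]

def g_ba(y):
    """Placeholder generator, modality B -> A."""
    return [0.5 * v for v in y]

def cycle_consistency_loss(x_a, y_b):
    """L_cyc = ||G_BA(G_AB(x)) - x||_1 + ||G_AB(G_BA(y)) - y||_1."""
    return l1_loss(g_ba(g_ab(x_a)), x_a) + l1_loss(g_ab(g_ba(y_b)), y_b)

x_a = [1.0, 2.0, 3.0]   # toy modality-A "volume"
y_b = [4.0, 5.0, 6.0]   # toy modality-B "volume"
loss = cycle_consistency_loss(x_a, y_b)
# These placeholder generators are exact inverses, so the cycle loss is 0.0
```

In practice the generators are 3-D convolutional networks and the loss is minimized jointly with the adversarial terms, which is what allows training without paired cross-modality scans.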
First Page
8187
Last Page
8198
DOI
10.1109/TIP.2020.3011557
Publication Date
7-29-2020
Keywords
anatomical structure, brain MRI, GANs, multi-modality, synthesis
Recommended Citation
Y. Huang, F. Zheng, R. Cong, W. Huang, M. R. Scott and L. Shao, "MCMT-GAN: Multi-Task Coherent Modality Transferable GAN for 3D Brain Image Synthesis," in IEEE Transactions on Image Processing, vol. 29, pp. 8187-8198, 2020, doi: 10.1109/TIP.2020.3011557.
Additional Links
DOI link: https://doi.org/10.1109/TIP.2020.3011557
Comments
IR Deposit conditions:
OA version (pathway a) Accepted version
No embargo
When accepted for publication, set statement to accompany deposit (see policy)
Must link to publisher version with DOI
Publisher copyright and source must be acknowledged