Progressively volumetrized deep generative models for data-efficient contextual learning of MR image recovery

Date

2022-05

Source Title

Medical Image Analysis

Print ISSN

1361-8415

Electronic ISSN

1361-8423

Publisher

Elsevier BV

Volume

78

Pages

[1] - [19]

Language

English

Abstract

Magnetic resonance imaging (MRI) offers the flexibility to image a given anatomic volume under a multitude of tissue contrasts. Yet, scan time considerations put stringent limits on the quality and diversity of MRI data. The gold-standard approach to alleviate this limitation is to recover high-quality images from data undersampled across various dimensions, most commonly the Fourier domain or contrast sets. A primary distinction among recovery methods is whether the anatomy is processed per volume or per cross-section. Volumetric models offer enhanced capture of global contextual information, but they can suffer from suboptimal learning due to elevated model complexity. Cross-sectional models with lower complexity offer improved learning behavior, yet they ignore contextual information across the longitudinal dimension of the volume. Here, we introduce a novel progressive volumetrization strategy for generative models (ProvoGAN) that serially decomposes complex volumetric image recovery tasks into successive cross-sectional mappings task-optimally ordered across individual rectilinear dimensions. ProvoGAN effectively captures global context and recovers fine-structural details across all dimensions, while maintaining low model complexity and improved learning behavior. Comprehensive demonstrations on mainstream MRI reconstruction and synthesis tasks show that ProvoGAN yields superior performance to state-of-the-art volumetric and cross-sectional models.
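
The progressive volumetrization idea described above can be illustrated with a short sketch: a low-complexity 2D mapping is applied slice-by-slice along one rectilinear axis of the volume, and its output is handed to the next stage operating along another axis. The code below is a minimal illustration assuming PyTorch; the SliceGenerator class, the apply_along_axis helper, and the fixed axial-coronal-sagittal ordering are hypothetical stand-ins, not the authors' released implementation, which uses adversarially trained generators and a task-optimized stage ordering.

```python
import torch
import torch.nn as nn

class SliceGenerator(nn.Module):
    """Toy 2D mapping applied to each cross-section (stand-in for a 2D GAN generator)."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):                      # x: (batch, channels, H, W)
        return self.net(x)

def apply_along_axis(gen: nn.Module, vol: torch.Tensor, axis: int) -> torch.Tensor:
    """Run a 2D mapping over every slice of a (channels, D1, D2, D3) volume along `axis` in {1, 2, 3}."""
    slices = [gen(s.unsqueeze(0)).squeeze(0)   # each slice becomes a batch of one 2D image
              for s in vol.unbind(dim=axis)]
    return torch.stack(slices, dim=axis)

# Three progressive stages, one per rectilinear orientation. ProvoGAN selects a
# task-optimal ordering; the axial -> coronal -> sagittal order here is fixed
# purely for illustration.
undersampled = torch.randn(1, 64, 64, 64)      # toy (channels, x, y, z) volume
stages = [(SliceGenerator(), axis) for axis in (1, 2, 3)]

recovered = undersampled
for generator, axis in stages:                 # each stage refines the previous stage's output
    recovered = apply_along_axis(generator, recovered, axis)
print(recovered.shape)                         # torch.Size([1, 64, 64, 64])
```

Each stage keeps the per-slice model complexity of a cross-sectional network while exposing the later stages to context from the dimensions already processed, which is the data-efficiency argument made in the abstract.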
