Multi-Task Accelerated MR Reconstruction Schemes for Jointly Training Multiple Contrasts
Abstract
Model-based accelerated MRI reconstruction methods leverage large datasets to reconstruct diagnostic-quality images from undersampled k-space. These networks require matching training and test-time distributions to achieve high-quality reconstructions. However, MR datasets vary inherently in contrast, orientation, anatomy, and institution-specific protocol. The current paradigm is to train a separate model for each dataset, which is demanding and cannot exploit information that may be shared among datasets. To address this issue, we propose multi-task learning (MTL) schemes that jointly reconstruct multiple datasets. We test multiple MTL architectures and weighted loss functions against single-task learning (STL) baselines. Our quantitative and qualitative results suggest that MTL can outperform STL across a range of dataset ratios for two knee contrasts.
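The abstract mentions weighted loss functions for jointly training multiple contrasts. As a minimal sketch (not the paper's exact formulation; the weighting scheme and function name here are hypothetical), a weighted MTL objective can combine per-contrast reconstruction losses with normalized weights:

```python
import numpy as np

def weighted_mtl_loss(task_losses, weights):
    """Combine per-task reconstruction losses into one scalar objective.

    Hypothetical weighted-loss scheme: L = sum_i w_i * L_i, where the
    weights are normalized to sum to 1 so that rebalancing two contrasts
    (e.g. for skewed dataset ratios) does not change the loss scale.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so weights sum to 1
    return float(np.dot(w, np.asarray(task_losses, dtype=float)))
```

For example, with equal weights the combined loss is the mean of the per-contrast losses, while upweighting one contrast shifts the objective toward that task.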
Cite
Text
Liu et al. "Multi-Task Accelerated MR Reconstruction Schemes for Jointly Training Multiple Contrasts." NeurIPS 2021 Workshops: Deep_Inverse, 2021.
Markdown
[Liu et al. "Multi-Task Accelerated MR Reconstruction Schemes for Jointly Training Multiple Contrasts." NeurIPS 2021 Workshops: Deep_Inverse, 2021.](https://mlanthology.org/neuripsw/2021/liu2021neuripsw-multitask-a/)
BibTeX
@inproceedings{liu2021neuripsw-multitask-a,
title = {{Multi-Task Accelerated MR Reconstruction Schemes for Jointly Training Multiple Contrasts}},
author = {Liu, Victoria and Ryu, Kanghyun and Alkan, Cagan and Pauly, John M. and Vasanawala, Shreyas},
booktitle = {NeurIPS 2021 Workshops: Deep_Inverse},
year = {2021},
url = {https://mlanthology.org/neuripsw/2021/liu2021neuripsw-multitask-a/}
}