Continuous U-Net: Faster, Greater and Noiseless

Abstract

Image segmentation is a fundamental task in image analysis and clinical practice. The current state-of-the-art techniques are based on U-shaped encoder-decoder networks with skip connections, known as U-Nets. Despite the strong performance reported for existing U-Net type networks, they suffer from several major limitations: the receptive field size is hard-coded, which compromises both performance and computational cost, and they do not account for inherent noise in the data. They also inherit the problems associated with discrete layers and offer no theoretical underpinning. In this work we introduce continuous U-Net, a novel family of networks for image segmentation. Firstly, continuous U-Net is a continuous deep neural network built from new dynamic blocks modelled by second-order ordinary differential equations. Secondly, we provide theoretical guarantees for our network, demonstrating faster convergence, higher robustness, and lower sensitivity to noise. Thirdly, we derive qualitative measures for tailor-made segmentation tasks. We demonstrate, through extensive numerical and visual results, that our model outperforms existing U-Net blocks on several medical image segmentation benchmark datasets.
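To make the central idea concrete, here is a minimal sketch (not the authors' implementation) of a "dynamic block" whose hidden state evolves under a second-order ODE x''(t) = f(x, x'). The standard trick is to reduce it to a first-order system over z = (x, v) and integrate numerically; explicit Euler and the toy damped-oscillator field below are illustrative assumptions — in the paper the dynamics are parameterised by a learned network and solved with an ODE solver.

```python
def dynamic_block(x0, v0, f, t1=1.0, steps=100):
    """Integrate x'' = f(x, v) from t=0 to t=t1 with explicit Euler.

    The second-order ODE is rewritten as the first-order system
        x' = v,   v' = f(x, v),
    and the block's output is the state x at time t1.
    """
    h = t1 / steps
    x, v = list(x0), list(v0)
    for _ in range(steps):
        a = f(x, v)  # "acceleration" given by the (here hand-picked) field
        x = [xi + h * vi for xi, vi in zip(x, v)]
        v = [vi + h * ai for vi, ai in zip(v, a)]
    return x

# Toy damped-oscillator field standing in for a learned network:
# x'' = -x - 0.5 * x'
def f(x, v):
    return [-xi - 0.5 * vi for xi, vi in zip(x, v)]

state = dynamic_block(x0=[1.0, 0.0], v0=[0.0, 0.0], f=f)
```

The damping term (here the fixed coefficient 0.5) is what gives second-order formulations their characteristic smoothing behaviour; in a learned block it would be produced by trainable layers rather than fixed by hand.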

Cite

Text

Cheng et al. "Continuous U-Net: Faster, Greater and Noiseless." Transactions on Machine Learning Research, 2024.

Markdown

[Cheng et al. "Continuous U-Net: Faster, Greater and Noiseless." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/cheng2024tmlr-continuous/)

BibTeX

@article{cheng2024tmlr-continuous,
  title     = {{Continuous U-Net: Faster, Greater and Noiseless}},
  author    = {Cheng, Chun-Wun and Runkel, Christina and Liu, Lihao and Chan, Raymond H. and Schönlieb, Carola-Bibiane and Aviles-Rivero, Angelica I.},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/cheng2024tmlr-continuous/}
}