Self-Attention Capsule Network for Tissue Classification in Case of Challenging Medical Image Statistics

Abstract

We propose the first Self-Attention Capsule Network designed to address the core challenges of medical imaging, specifically for tissue classification. These challenges are: significant data heterogeneity, with statistics that vary across imaging domains; the need to capture both spatial context and fine-grained local detail; and limited training data. Our method also addresses limitations of the baseline Capsule Network (CapsNet), such as difficulty handling complex data and high computational cost. To cope with these challenges, our method incorporates a self-attention module that reduces the complexity of the input data so that the CapsNet routing mechanism can be applied efficiently, while extracting far richer contextual information than CNNs. To demonstrate the strengths of our method, it was extensively evaluated on three diverse medical datasets and three natural-image benchmarks. The proposed method outperformed the compared methods not only in classification accuracy but also in robustness, both within and across datasets and domains.
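The abstract describes a pipeline in which a self-attention module refines the feature maps before capsule routing. The following is a minimal NumPy sketch of that general idea, not the authors' implementation: all shapes, weight initializations, and the SAGAN-style attention formulation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(feat, Wq, Wk, Wv, gamma=1.0):
    # SAGAN-style self-attention over (N, C) spatial features:
    # each position attends to every other position, adding global context.
    q, k, v = feat @ Wq, feat @ Wk, feat @ Wv
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]), axis=-1)  # (N, N)
    return feat + gamma * (attn @ v)                          # residual connection

def squash(s, axis=-1, eps=1e-8):
    # CapsNet nonlinearity: shrinks vector length into [0, 1).
    n2 = (s ** 2).sum(axis=axis, keepdims=True)
    return (n2 / (1.0 + n2)) * s / np.sqrt(n2 + eps)

def dynamic_routing(u_hat, iters=3):
    # u_hat: (num_in, num_out, dim_out) prediction vectors.
    b = np.zeros(u_hat.shape[:2])               # routing logits
    for _ in range(iters):
        c = softmax(b, axis=1)                  # coupling coefficients per input capsule
        s = (c[..., None] * u_hat).sum(axis=0)  # weighted sum over input capsules
        v = squash(s)
        b = b + (u_hat * v[None]).sum(axis=-1)  # agreement update
    return v

# Toy forward pass: 16 spatial positions with 8-dim features (hypothetical sizes).
feat = rng.normal(size=(16, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) * 0.1 for _ in range(3))
attended = self_attention(feat, Wq, Wk, Wv)

# Attended features act as primary capsules; route to 4 class capsules of dim 6.
W = rng.normal(size=(16, 4, 8, 6)) * 0.1  # (in_caps, out_caps, dim_in, dim_out)
u_hat = np.einsum('id,iodk->iok', attended, W)
v = dynamic_routing(u_hat)                # (4, 6); predicted class = longest capsule
```

The intuition matches the abstract: attention globally reorganizes the features so the (expensive) routing step operates on a simpler, context-enriched representation.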

Cite

Text

Hoogi et al. "Self-Attention Capsule Network for Tissue Classification in Case of Challenging Medical Image Statistics." European Conference on Computer Vision Workshops, 2022. doi:10.1007/978-3-031-25066-8_10

Markdown

[Hoogi et al. "Self-Attention Capsule Network for Tissue Classification in Case of Challenging Medical Image Statistics." European Conference on Computer Vision Workshops, 2022.](https://mlanthology.org/eccvw/2022/hoogi2022eccvw-selfattention/) doi:10.1007/978-3-031-25066-8_10

BibTeX

@inproceedings{hoogi2022eccvw-selfattention,
  title     = {{Self-Attention Capsule Network for Tissue Classification in Case of Challenging Medical Image Statistics}},
  author    = {Hoogi, Assaf and Wilcox, Brian and Gupta, Yachee and Rubin, Daniel L.},
  booktitle = {European Conference on Computer Vision Workshops},
  year      = {2022},
  pages     = {219--235},
  doi       = {10.1007/978-3-031-25066-8_10},
  url       = {https://mlanthology.org/eccvw/2022/hoogi2022eccvw-selfattention/}
}