How Important Are Specialized Transforms in Neural Operators?
Abstract
Transform-based Neural Operators such as Fourier Neural Operators and Wavelet Neural Operators have received considerable attention for their potential to provide fast solutions for systems of Partial Differential Equations. In this work, we investigate the performance cost of replacing all transform layers with learnable linear layers. We observe that linear layers suffice to provide performance comparable to the best-known transform-based layers, and appear to do so with a possible compute-time advantage as well. We believe this observation can have significant implications for future work on Neural Operators.
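The substitution studied here can be illustrated with a minimal sketch (not the authors' implementation): a Fourier-style layer applies a fixed spectral transform, weights a few low modes with learned parameters, and inverts the transform, while the linear alternative replaces the whole transform pair with a single learnable dense matrix acting directly on grid values. All names and sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64  # number of grid points (illustrative)

def fourier_layer(u, spectral_weights, modes=8):
    """Fourier-style layer: fixed FFT, learned weights on low modes, inverse FFT."""
    u_hat = np.fft.rfft(u)                        # fixed spectral transform
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = spectral_weights * u_hat[:modes]  # learned complex weights
    return np.fft.irfft(out_hat, n=len(u))        # fixed inverse transform

def linear_layer(u, W):
    """Learnable alternative: one dense matrix in place of the transform pair."""
    return W @ u

u = np.sin(2 * np.pi * np.arange(n) / n)          # toy input function on the grid
spectral_weights = rng.standard_normal(8) + 1j * rng.standard_normal(8)
W = rng.standard_normal((n, n)) / np.sqrt(n)      # learnable in a real model

print(fourier_layer(u, spectral_weights).shape, linear_layer(u, W).shape)
```

Both layers map a length-`n` grid function to another length-`n` grid function; the paper's question is whether the fixed transform structure buys anything once `W` is trained.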
Cite
Text
Majumdar et al. "How Important Are Specialized Transforms in Neural Operators?" ICML 2023 Workshops: SynS_and_ML, 2023.
Markdown
[Majumdar et al. "How Important Are Specialized Transforms in Neural Operators?" ICML 2023 Workshops: SynS_and_ML, 2023.](https://mlanthology.org/icmlw/2023/majumdar2023icmlw-important/)
BibTeX
@inproceedings{majumdar2023icmlw-important,
title = {{How Important Are Specialized Transforms in Neural Operators?}},
author = {Majumdar, Ritam and Karande, Shirish and Vig, Lovekesh},
booktitle = {ICML 2023 Workshops: SynS_and_ML},
year = {2023},
url = {https://mlanthology.org/icmlw/2023/majumdar2023icmlw-important/}
}