On Incorporating Scale into Graph Networks
Abstract
Standard graph neural networks assign vastly different latent embeddings to graphs describing the same physical system at different resolution scales. This precludes consistency in applications and prevents the generalization between scales that is fundamentally needed in many scientific settings. We uncover the underlying obstruction, investigate its origin, and show how to overcome it.
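The effect the abstract describes can be made concrete with a small numerical experiment. The sketch below is not the paper's construction; it is a hypothetical illustration, assuming a single sum-aggregation message-passing layer with random weights and a mean-pooling readout, applied to the same 1D system sampled at a coarse and a fine resolution. All names (`path_graph`, `gnn_embedding`, `W_self`, `W_neigh`) are illustrative and not taken from the paper.

```python
# Minimal sketch of scale inconsistency in a standard message-passing GNN.
# Assumption: one sum-aggregation layer, tanh nonlinearity, mean pooling.
import numpy as np

rng = np.random.default_rng(0)
W_self = rng.normal(size=(1, 8))   # hypothetical weights for a node's own feature
W_neigh = rng.normal(size=(1, 8))  # hypothetical weights for aggregated neighbours


def path_graph(n):
    """Adjacency matrix of a path graph with n nodes (a 1D chain)."""
    A = np.zeros((n, n))
    idx = np.arange(n - 1)
    A[idx, idx + 1] = A[idx + 1, idx] = 1.0
    return A


def gnn_embedding(n):
    """Sum-aggregation message passing on a path graph, then mean pooling."""
    A = path_graph(n)
    x = np.linspace(0.0, 1.0, n)[:, None]        # same scalar field, sampled at n points
    h = np.tanh(x @ W_self + (A @ x) @ W_neigh)  # standard MPNN-style update
    return h.mean(axis=0)                        # graph-level readout


coarse = gnn_embedding(4)   # coarse discretisation of the system
fine = gnn_embedding(32)    # fine discretisation of the *same* system
print("relative embedding gap:", np.linalg.norm(coarse - fine) / np.linalg.norm(coarse))
```

Running this prints a sizeable relative gap even though both graphs discretise the same underlying system, which is the inconsistency across resolution scales that the paper sets out to diagnose and remove.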
Cite
Text
Koke et al. "On Incorporating Scale into Graph Networks." ICLR 2025 Workshops: MLMP, 2025.
Markdown
[Koke et al. "On Incorporating Scale into Graph Networks." ICLR 2025 Workshops: MLMP, 2025.](https://mlanthology.org/iclrw/2025/koke2025iclrw-incorporating/)
BibTeX
@inproceedings{koke2025iclrw-incorporating,
  title = {{On Incorporating Scale into Graph Networks}},
  author = {Koke, Christian and Shen, Yuesong and Saroha, Abhishek and Eisenberger, Marvin and Rieck, Bastian and Bronstein, Michael M. and Cremers, Daniel},
  booktitle = {ICLR 2025 Workshops: MLMP},
  year = {2025},
  url = {https://mlanthology.org/iclrw/2025/koke2025iclrw-incorporating/}
}