Advancing Out-of-Distribution Detection via Local Neuroplasticity
Abstract
In the domain of machine learning, the assumption that training and test data share the same distribution is often violated in real-world scenarios, requiring effective out-of-distribution (OOD) detection. This paper presents a novel OOD detection method that leverages the unique local neuroplasticity property of Kolmogorov-Arnold Networks (KANs). Unlike traditional multilayer perceptrons, KANs exhibit local plasticity, allowing them to preserve learned information while adapting to new tasks. Our method compares the activation patterns of a trained KAN against its untrained counterpart to detect OOD samples. We validate our approach on benchmarks from image and medical domains, demonstrating superior performance and robustness compared to state-of-the-art techniques. These results underscore the potential of KANs in enhancing the reliability of machine learning systems in diverse environments.
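The abstract describes the method only at a high level: a KAN trained on in-distribution data updates its spline activations only in the input regions covered by that data, so comparing the trained network against an untrained copy reveals whether a sample falls in a "seen" region. Below is a minimal, self-contained sketch of that idea, not the authors' implementation; the layer class TinySplineLayer and the function ood_score are illustrative names, and the real method works on full KAN activation patterns rather than a toy 1-D spline.

# Sketch of the idea only (assumed, simplified): a KAN-like layer whose
# piecewise-linear activation is updated only on grid cells covered by
# in-distribution training data, compared against an untrained copy.

import copy
import torch
import torch.nn as nn


class TinySplineLayer(nn.Module):
    """Toy 1-D 'KAN-like' layer: a learnable piecewise-linear function on a
    fixed knot grid. Only knots adjacent to training inputs receive gradient,
    which mimics the local plasticity the paper relies on."""

    def __init__(self, n_knots: int = 32, lo: float = -3.0, hi: float = 3.0):
        super().__init__()
        self.register_buffer("knots", torch.linspace(lo, hi, n_knots))
        self.values = nn.Parameter(torch.zeros(n_knots))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Linear interpolation between the two knot values bracketing x.
        idx = torch.bucketize(x, self.knots).clamp(1, len(self.knots) - 1)
        x0, x1 = self.knots[idx - 1], self.knots[idx]
        w = (x - x0) / (x1 - x0)
        return (1 - w) * self.values[idx - 1] + w * self.values[idx]


def ood_score(trained: TinySplineLayer, untrained: TinySplineLayer,
              x: torch.Tensor) -> torch.Tensor:
    """Hypothetical score: |trained(x) - untrained(x)| is large where training
    data updated the grid and near zero elsewhere, so a small value flags x
    as out-of-distribution."""
    with torch.no_grad():
        return (trained(x) - untrained(x)).abs()


if __name__ == "__main__":
    torch.manual_seed(0)
    layer = TinySplineLayer()
    frozen = copy.deepcopy(layer)  # untrained reference copy

    # Fit on in-distribution inputs concentrated around 0.
    x_id = torch.randn(2048) * 0.5
    y_id = torch.sin(3 * x_id)
    opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
    for _ in range(500):
        opt.zero_grad()
        loss = ((layer(x_id) - y_id) ** 2).mean()
        loss.backward()
        opt.step()

    # In-distribution inputs land on updated cells (high score);
    # far-away inputs land on untouched cells (near-zero score).
    print("ID  score:", ood_score(layer, frozen, torch.tensor([0.2])).item())
    print("OOD score:", ood_score(layer, frozen, torch.tensor([2.8])).item())

In this toy setup the untrained copy is constant, so the score reduces to the magnitude of the trained activation; the point is only to illustrate that regions never visited by in-distribution data remain unchanged and therefore score near zero.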
Cite

Text:
Canevaro et al. "Advancing Out-of-Distribution Detection via Local Neuroplasticity." International Conference on Learning Representations, 2025.

Markdown:
[Canevaro et al. "Advancing Out-of-Distribution Detection via Local Neuroplasticity." International Conference on Learning Representations, 2025.](https://mlanthology.org/iclr/2025/canevaro2025iclr-advancing/)

BibTeX:
@inproceedings{canevaro2025iclr-advancing,
title = {{Advancing Out-of-Distribution Detection via Local Neuroplasticity}},
author = {Canevaro, Alessandro and Schmidt, Julian and Marvi, Mohammad Sajad and Yu, Hang and Martius, Georg and Jordan, Julian},
booktitle = {International Conference on Learning Representations},
year = {2025},
url = {https://mlanthology.org/iclr/2025/canevaro2025iclr-advancing/}
}