Distance-Informed Neural Processes
Abstract
We propose the Distance-Informed Neural Process (DNP), a variant of Neural Processes that improves uncertainty estimation by combining a global latent structure with a distance-aware local one. Standard Neural Processes (NPs) typically rely on a single global latent variable, which limits their uncertainty calibration and their ability to capture local data dependencies. DNP addresses these limitations by pairing a global latent variable, which models task-level variation, with a local latent variable that captures input similarity within a distance-preserving latent space. Distance preservation is enforced through bi-Lipschitz regularization, which bounds distortions of input relationships so that relative distances carry over into the latent space. As a result, DNP produces better-calibrated uncertainty estimates and more effectively distinguishes in-distribution from out-of-distribution data. Empirical results show that DNP achieves strong predictive performance and improved uncertainty calibration across regression and classification tasks.
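To make the bi-Lipschitz idea concrete: an encoder f is bi-Lipschitz when pairwise distances are neither collapsed nor blown up, i.e. K1 · d(x1, x2) ≤ d(f(x1), f(x2)) ≤ K2 · d(x1, x2) for constants 0 < K1 ≤ K2. The sketch below shows one hypothetical way to penalize violations of such a band in PyTorch; the function name, the band limits, and the hinge-style penalty are illustrative assumptions, not the paper's actual regularizer.

import torch

def bi_lipschitz_penalty(x, z, k_min=0.5, k_max=2.0):
    # Hypothetical soft penalty on pairwise distance distortion.
    # x: (B, Dx) batch of inputs; z: (B, Dz) their latent encodings.
    dx = torch.cdist(x, x) + 1e-8   # pairwise input distances (avoid div by 0)
    dz = torch.cdist(z, z)          # pairwise latent distances
    ratio = dz / dx                 # per-pair distance distortion
    mask = ~torch.eye(x.shape[0], dtype=torch.bool, device=x.device)  # drop self-pairs
    over = torch.relu(ratio - k_max)   # expansion beyond the upper bound
    under = torch.relu(k_min - ratio)  # contraction below the lower bound
    return (over + under)[mask].mean()

# Usage sketch: regularize a toy encoder's latent space.
x = torch.randn(16, 2)
encoder = torch.nn.Linear(2, 8)
loss = bi_lipschitz_penalty(x, encoder(x))

In practice such a term would be added to the NP training objective with a weighting coefficient; architectural alternatives such as spectral normalization can enforce the upper Lipschitz bound directly.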
Cite
Text
Venkataramanan and Denzler. "Distance-Informed Neural Processes." Advances in Neural Information Processing Systems, 2025.
Markdown
[Venkataramanan and Denzler. "Distance-Informed Neural Processes." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/venkataramanan2025neurips-distanceinformed/)
BibTeX
@inproceedings{venkataramanan2025neurips-distanceinformed,
title = {{Distance-Informed Neural Processes}},
author = {Venkataramanan, Aishwarya and Denzler, Joachim},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/venkataramanan2025neurips-distanceinformed/}
}