Nonlinear Reconciliation: Error Reduction Theorems

Abstract

Forecast reconciliation, an ex-post technique applied to forecasts that must satisfy constraints, has been a prominent topic in the forecasting literature over the past two decades. Recently, several efforts have sought to extend reconciliation methods to the probabilistic setting. Nevertheless, formal theorems demonstrating error reduction in nonlinear contexts, analogous to those presented in Panagiotelis et al. (2021), are still lacking. This paper addresses that gap by establishing such theorems for various classes of nonlinear hypersurfaces and vector-valued functions. Specifically, we derive an exact analog of Theorem 3.1 from Panagiotelis et al. (2021) for hypersurfaces with constant-sign curvature. Additionally, we provide an error reduction theorem for the broader case of hypersurfaces with non-constant-sign curvature and for general manifolds of codimension greater than one. To support reproducibility and practical adoption, we release JNLR, a JAX-based Python package implementing the presented theorems and reconciliation procedures.
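As a rough illustration of the setting described above, the sketch below projects an unconstrained base forecast onto a nonlinear constraint manifold {y : g(y) = 0} using iterated linearisation (Gauss-Newton-style projection steps) in JAX. The function name reconcile, the solver, and the toy sphere constraint are illustrative assumptions chosen for exposition; they are not the JNLR API and not necessarily the reconciliation procedure analysed in the paper.

import jax
import jax.numpy as jnp

def reconcile(g, y_hat, num_iters=50):
    # Project a base forecast y_hat onto the constraint set {y : g(y) = 0}
    # by repeatedly projecting onto the linearisation of g at the current
    # iterate (a minimum-norm Gauss-Newton correction per step).
    # g maps R^n -> R^m, where m is the codimension of the manifold.
    jac = jax.jacobian(g)
    y = y_hat
    for _ in range(num_iters):
        J = jac(y)                              # (m, n) constraint Jacobian
        r = g(y)                                # (m,) constraint residual
        y = y - J.T @ jnp.linalg.solve(J @ J.T, r)
    return y

# Toy example: a 3-d forecast constrained to the unit sphere, a hypersurface
# with constant-sign curvature (hypothetical data, for illustration only).
g = lambda y: jnp.atleast_1d(jnp.sum(y**2) - 1.0)
y_hat = jnp.array([1.3, -0.4, 0.2])             # unconstrained base forecast
y_rec = reconcile(g, y_hat)
print(y_rec, g(y_rec))                          # residual is ~0 after reconciliation

When g is affine, a single step of this iteration reduces to the familiar projection-based linear reconciliation; for nonlinear g, the error reduction guarantees studied in the paper concern how such projected forecasts compare to the base forecasts, under curvature conditions on the constraint surface.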

Cite

Text

Nespoli et al. "Nonlinear Reconciliation: Error Reduction Theorems." Transactions on Machine Learning Research, 2026.

Markdown

[Nespoli et al. "Nonlinear Reconciliation: Error Reduction Theorems." Transactions on Machine Learning Research, 2026.](https://mlanthology.org/tmlr/2026/nespoli2026tmlr-nonlinear/)

BibTeX

@article{nespoli2026tmlr-nonlinear,
  title     = {{Nonlinear Reconciliation: Error Reduction Theorems}},
  author    = {Nespoli, Lorenzo and Biswas, Anubhab and Rocchetta, Roberto and Medici, Vasco},
  journal   = {Transactions on Machine Learning Research},
  year      = {2026},
  url       = {https://mlanthology.org/tmlr/2026/nespoli2026tmlr-nonlinear/}
}