Infinite-Dimensional Optimization and Bayesian Nonparametric Learning of Stochastic Differential Equations
Abstract
The paper has two major themes. The first part establishes general results for infinite-dimensional optimization problems on Hilbert spaces; these results cover the classical representer theorem and many of its variants as special cases and admit a wider range of applications. The second part develops a systematic approach to learning the drift function of a stochastic differential equation by integrating the results of the first part with a Bayesian hierarchical framework. Importantly, our Bayesian approach achieves low-cost sparse learning through the careful use of shrinkage priors while quantifying uncertainty through posterior distributions. Several examples illustrate the accuracy of our learning scheme.
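For context, a minimal statement of the classical result that the first part generalizes (standard notation, not the paper's own): for a reproducing kernel Hilbert space \mathcal{H} with kernel k, the representer theorem guarantees that regularized empirical risk minimization over the infinite-dimensional space \mathcal{H} reduces to a finite-dimensional problem,

\hat{f} = \arg\min_{f \in \mathcal{H}} \sum_{i=1}^{n} L\big(y_i, f(x_i)\big) + \lambda \|f\|_{\mathcal{H}}^2
\quad \Longrightarrow \quad
\hat{f}(\cdot) = \sum_{i=1}^{n} \alpha_i \, k(\cdot, x_i), \qquad \alpha_1, \dots, \alpha_n \in \mathbb{R}.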
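And a sketch of the kind of drift-learning setup the second part addresses (a generic formulation for illustration; the paper's exact model, priors, and likelihood may differ): given a path observed from

dX_t = b(X_t)\, dt + \sigma\, dW_t, \qquad t \in [0, T],

one expands the unknown drift in a kernel or basis system, b(x) = \sum_{j=1}^{p} \beta_j \, \phi_j(x), and places a shrinkage prior on the coefficients, for example a global-local scale mixture

\beta_j \mid \lambda_j, \tau \sim \mathcal{N}(0, \tau^2 \lambda_j^2),

with heavy-tailed priors on the local scales \lambda_j, so that posterior sampling yields both a sparse estimate of b and credible bands quantifying its uncertainty.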
Cite

Text:
Ganguly et al. "Infinite-Dimensional Optimization and Bayesian Nonparametric Learning of Stochastic Differential Equations." Journal of Machine Learning Research, 2023.

Markdown:
[Ganguly et al. "Infinite-Dimensional Optimization and Bayesian Nonparametric Learning of Stochastic Differential Equations." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/ganguly2023jmlr-infinitedimensional/)

BibTeX:
@article{ganguly2023jmlr-infinitedimensional,
  title   = {{Infinite-Dimensional Optimization and Bayesian Nonparametric Learning of Stochastic Differential Equations}},
  author  = {Ganguly, Arnab and Mitra, Riten and Zhou, Jinpu},
  journal = {Journal of Machine Learning Research},
  year    = {2023},
  volume  = {24},
  pages   = {1--39},
  url     = {https://mlanthology.org/jmlr/2023/ganguly2023jmlr-infinitedimensional/}
}