Improved PAC-Bayesian Bounds for Linear Regression

Abstract

In this paper, we improve the PAC-Bayesian error bound for linear regression derived in Germain et al. (2016). The improvements are twofold. First, the proposed error bound is tighter and converges to the generalization loss for a well-chosen temperature parameter. Second, the error bound also holds for training data that are not independently sampled. In particular, it applies to certain time series generated by well-known classes of dynamical models, such as ARX models.
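
For context, a sketch of the ARX (autoregressive with exogenous input) model class mentioned in the abstract is given below in its standard textbook form; the orders n_a, n_b, the coefficients a_i, b_j, and the noise term e_t are generic placeholders and do not reproduce the paper's own notation.

% Standard ARX(n_a, n_b) model: the output y_t is a linear function of its own
% past values, past exogenous inputs u_t, and an additive noise term e_t.
% Symbols and orders here are illustrative, not taken from the paper.
\begin{equation}
  y_t = \sum_{i=1}^{n_a} a_i \, y_{t-i} + \sum_{j=1}^{n_b} b_j \, u_{t-j} + e_t
\end{equation}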

Cite

Text

Shalaeva et al. "Improved PAC-Bayesian Bounds for Linear Regression." AAAI Conference on Artificial Intelligence, 2020. doi:10.1609/AAAI.V34I04.6020

Markdown

[Shalaeva et al. "Improved PAC-Bayesian Bounds for Linear Regression." AAAI Conference on Artificial Intelligence, 2020.](https://mlanthology.org/aaai/2020/shalaeva2020aaai-improved/) doi:10.1609/AAAI.V34I04.6020

BibTeX

@inproceedings{shalaeva2020aaai-improved,
  title     = {{Improved PAC-Bayesian Bounds for Linear Regression}},
  author    = {Shalaeva, Vera and Esfahani, Alireza Fakhrizadeh and Germain, Pascal and Petreczky, Mihály},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {5660--5667},
  doi       = {10.1609/AAAI.V34I04.6020},
  url       = {https://mlanthology.org/aaai/2020/shalaeva2020aaai-improved/}
}