Posterior Consistency for Bayesian Relevance Vector Machines

Abstract

Statistical modeling and inference problems with sample sizes substantially smaller than the number of available covariates are challenging. Chakraborty et al. (2012) carried out a full hierarchical Bayesian analysis of nonlinear regression in such situations using relevance vector machines based on a reproducing kernel Hilbert space (RKHS). However, they did not provide any theoretical properties associated with their procedure. The present paper revisits their problem, introduces a new class of global-local priors different from theirs, and provides results on posterior consistency as well as on posterior contraction rates.

Cite

Text

Fang and Ghosh. "Posterior Consistency for Bayesian Relevance Vector Machines." Journal of Machine Learning Research, 2023.

Markdown

[Fang and Ghosh. "Posterior Consistency for Bayesian Relevance Vector Machines." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/fang2023jmlr-posterior/)

BibTeX

@article{fang2023jmlr-posterior,
  title     = {{Posterior Consistency for Bayesian Relevance Vector Machines}},
  author    = {Fang, Xiao and Ghosh, Malay},
  journal   = {Journal of Machine Learning Research},
  year      = {2023},
  pages     = {1--17},
  volume    = {24},
  url       = {https://mlanthology.org/jmlr/2023/fang2023jmlr-posterior/}
}