Sparse/Robust Estimation and Kalman Smoothing with Nonsmooth Log-Concave Densities: Modeling, Computation, and Theory

Abstract

We introduce a new class of quadratic support (QS) functions, many of which already play a crucial role in a variety of applications, including machine learning, robust statistical inference, sparsity promotion, and inverse problems such as Kalman smoothing. Well-known examples of QS penalties include the $\ell_2$, Huber, $\ell_1$ and Vapnik losses. We build on a dual representation for QS functions, using it to characterize the conditions necessary to interpret these functions as negative logs of true probability densities. This interpretation establishes the foundation for statistical modeling with both known and new QS loss functions, and enables construction of non-smooth multivariate distributions with specified means and variances from simple scalar building blocks.
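The scalar QS penalties named above admit both closed forms and a dual (support-function) representation of the general shape $\rho(y) = \sup_{u \in U}\{uy - \tfrac{1}{2}Mu^2\}$. The sketch below, with illustrative function names and a grid-based check of the dual form (not code from the paper), shows the Huber and Vapnik ($\epsilon$-insensitive) losses as instances of this class.

```python
import numpy as np

def huber(r, kappa=1.0):
    # Closed form of the Huber penalty: quadratic near zero, linear in the tails.
    r = np.asarray(r, dtype=float)
    return np.where(np.abs(r) <= kappa,
                    0.5 * r**2,
                    kappa * np.abs(r) - 0.5 * kappa**2)

def huber_dual(r, kappa=1.0, grid=2001):
    # QS (dual) representation of Huber: sup over u in [-kappa, kappa]
    # of u*r - u^2/2, approximated here on a finite grid of u values.
    r = np.atleast_1d(np.asarray(r, dtype=float))
    u = np.linspace(-kappa, kappa, grid)
    return np.max(np.outer(r, u) - 0.5 * u**2, axis=1)

def vapnik(r, eps=0.5):
    # Vapnik epsilon-insensitive loss: zero inside the [-eps, eps] band,
    # linear outside; also a QS function (with M = 0, u in [0, 1]^2).
    r = np.asarray(r, dtype=float)
    return np.maximum(0.0, np.abs(r) - eps)
```

For example, `huber(2.0)` with `kappa=1.0` gives `1.5` (the linear tail, $\kappa|r| - \kappa^2/2$), and the grid-based `huber_dual` agrees with the closed form, illustrating why these penalties fall into one common dual framework.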

Cite

Text

Aravkin et al. "Sparse/Robust Estimation and Kalman Smoothing with Nonsmooth Log-Concave Densities: Modeling, Computation, and Theory." Journal of Machine Learning Research, 2013.

Markdown

[Aravkin et al. "Sparse/Robust Estimation and Kalman Smoothing with Nonsmooth Log-Concave Densities: Modeling, Computation, and Theory." Journal of Machine Learning Research, 2013.](https://mlanthology.org/jmlr/2013/aravkin2013jmlr-sparse/)

BibTeX

@article{aravkin2013jmlr-sparse,
  title     = {{Sparse/Robust Estimation and Kalman Smoothing with Nonsmooth Log-Concave Densities: Modeling, Computation, and Theory}},
  author    = {Aravkin, Aleksandr Y. and Burke, James V. and Pillonetto, Gianluigi},
  journal   = {Journal of Machine Learning Research},
  year      = {2013},
  pages     = {2689--2728},
  volume    = {14},
  url       = {https://mlanthology.org/jmlr/2013/aravkin2013jmlr-sparse/}
}