Nonlinear Information-Theoretic Compressive Measurement Design

Abstract

We investigate the design of general nonlinear functions for mapping high-dimensional data into a lower-dimensional (compressive) space. The nonlinear measurements are assumed to be contaminated by additive Gaussian noise. Depending on the application, we are interested either in recovering the high-dimensional data from the nonlinear compressive measurements or in performing classification directly based on these measurements. The latter case corresponds to classification based on nonlinearly constituted and noisy features. The nonlinear measurement functions are designed via constrained mutual-information optimization. New analytic results are developed for the gradient of mutual information in this setting, for arbitrary input-signal statistics. We make connections to kernel-based methods, such as the support vector machine. Encouraging results are presented on multiple datasets, for both signal recovery and classification. The nonlinear approach is shown to be particularly valuable in high-noise scenarios.
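
For intuition only, the sketch below illustrates the kind of measurement model the abstract describes: a nonlinear function of a compressive projection, observed under additive Gaussian noise. It is not the paper's design procedure (no mutual-information optimization is performed); the tanh nonlinearity, the random projection matrix, the dimensions, and the noise level are all hypothetical choices for illustration.

import numpy as np

rng = np.random.default_rng(0)

n, m = 64, 8      # ambient and compressive dimensions (illustrative)
sigma = 0.1       # additive Gaussian noise standard deviation (illustrative)

# Hypothetical nonlinear compressive measurement: y = f(Phi x) + w, with f = tanh.
Phi = rng.standard_normal((m, n)) / np.sqrt(n)

def measure(x, Phi=Phi, sigma=sigma, rng=rng):
    """Map a high-dimensional signal x to m noisy nonlinear measurements."""
    return np.tanh(Phi @ x) + sigma * rng.standard_normal(m)

x = rng.standard_normal(n)   # a high-dimensional signal
y = measure(x)               # its m-dimensional noisy nonlinear measurement
print(y.shape)               # (8,)

In the paper, the measurement function itself (here fixed to tanh of a random projection) is instead optimized, subject to constraints, to maximize the mutual information between the measurements and either the signal (for recovery) or the class label (for classification).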

Cite

Text

Wang et al. "Nonlinear Information-Theoretic Compressive Measurement Design." International Conference on Machine Learning, 2014.

Markdown

[Wang et al. "Nonlinear Information-Theoretic Compressive Measurement Design." International Conference on Machine Learning, 2014.](https://mlanthology.org/icml/2014/wang2014icml-nonlinear/)

BibTeX

@inproceedings{wang2014icml-nonlinear,
  title     = {{Nonlinear Information-Theoretic Compressive Measurement Design}},
  author    = {Wang, Liming and Razi, Abolfazl and Rodrigues, Miguel and Calderbank, Robert and Carin, Lawrence},
  booktitle = {International Conference on Machine Learning},
  year      = {2014},
  pages     = {1161--1169},
  volume    = {32},
  url       = {https://mlanthology.org/icml/2014/wang2014icml-nonlinear/}
}