Experimental Design on a Budget for Sparse Linear Models and Applications

Abstract

Budget-constrained optimal design of experiments is a classical problem in statistics. Although the optimal design literature is very mature, few efficient strategies are available when these design problems arise in the context of sparse linear models commonly encountered in high-dimensional machine learning and statistics. In this work, we study experimental design for the setting where the underlying regression model is characterized by an ℓ1-regularized linear function. We propose two novel strategies: the first is motivated geometrically, whereas the second is algebraic in nature. We obtain tractable algorithms for this problem, and the results also hold for a more general class of sparse linear models. We perform an extensive set of experiments, on benchmarks and a large multi-site neuroscience study, showing that the proposed models are effective in practice. The latter experiment suggests that these ideas may play a small role in informing enrollment strategies for similar scientific studies in the short-to-medium term.
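
To make the problem setting concrete, the sketch below sets up a budget-constrained design for an ℓ1-regularized (Lasso) regression: from a pool of candidate experiments, a fixed number are selected and only their responses are observed before the sparse model is fit. The selection rule shown (a simple leverage-score heuristic) and all names and constants are illustrative assumptions, not the geometric or algebraic strategies proposed in the paper.

import numpy as np
from sklearn.linear_model import Lasso

# Illustration of the problem setting only: budget-constrained selection of
# experiments for an l1-regularized (sparse) linear model. The selection rule
# is a naive leverage-score heuristic, NOT the strategies from the paper.

rng = np.random.default_rng(0)
n_pool, p, budget = 500, 100, 60          # candidate pool size, features, experiment budget
X_pool = rng.standard_normal((n_pool, p))
beta_true = np.zeros(p)
beta_true[:5] = rng.standard_normal(5)    # sparse ground-truth coefficients

# Score each candidate experiment by its statistical leverage and keep the top-k.
U, _, _ = np.linalg.svd(X_pool, full_matrices=False)
leverage = (U ** 2).sum(axis=1)
chosen = np.argsort(leverage)[-budget:]

# Run only the selected experiments, observe noisy responses, fit the sparse model.
X_run = X_pool[chosen]
y_run = X_run @ beta_true + 0.1 * rng.standard_normal(budget)
model = Lasso(alpha=0.05).fit(X_run, y_run)
print("recovered support:", np.nonzero(model.coef_)[0])

In this toy setup, the recovered support can be compared against the true nonzero coefficients to gauge how much the choice of the budgeted subset affects sparse recovery, which is the trade-off the paper's design strategies target.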

Cite

Text

Ravi et al. "Experimental Design on a Budget for Sparse Linear Models and Applications." International Conference on Machine Learning, 2016.

Markdown

[Ravi et al. "Experimental Design on a Budget for Sparse Linear Models and Applications." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/ravi2016icml-experimental/)

BibTeX

@inproceedings{ravi2016icml-experimental,
  title     = {{Experimental Design on a Budget for Sparse Linear Models and Applications}},
  author    = {Ravi, Sathya Narayanan and Ithapu, Vamsi and Johnson, Sterling and Singh, Vikas},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {583--592},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/ravi2016icml-experimental/}
}