Nonuniformity of P-Values Can Occur Early in Diverging Dimensions

Abstract

Evaluating the joint significance of covariates is of fundamental importance in a wide range of applications. To this end, p-values are frequently employed and produced by algorithms that are powered by classical large-sample asymptotic theory. It is well known that the conventional p-values in the Gaussian linear model are valid even when the dimensionality is a non-vanishing fraction of the sample size, but can break down when the design matrix becomes singular in higher dimensions or when the error distribution deviates from Gaussianity. A natural question is when the conventional p-values in generalized linear models become invalid in diverging dimensions. We establish that such a breakdown can occur early in nonlinear models. Our theoretical characterizations are confirmed by simulation studies.
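The phenomenon is easy to probe numerically. The sketch below is illustrative and not the authors' code: it takes the conventional p-value to be the Wald p-value for a single coefficient in a maximum-likelihood logistic regression fit (computed here with statsmodels), generates data under the global null, and tracks how far the null p-values drift from uniformity as the dimensionality p grows relative to a fixed sample size n. The values n = 400 and p in {5, 50, 100} are arbitrary choices for illustration.

```python
# Illustrative sketch (assumptions: Wald p-values from statsmodels' Logit
# stand in for the "conventional" p-values; n and p are arbitrary choices).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def null_pvalues(n, p, n_reps=200):
    """Wald p-values for the first covariate when the true model is null."""
    pvals = []
    for _ in range(n_reps):
        X = rng.standard_normal((n, p))
        y = rng.binomial(1, 0.5, size=n)  # response independent of X
        try:
            fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0, maxiter=200)
        except Exception:
            continue  # skip rare separated / non-converged replications
        pvals.append(fit.pvalues[1])      # p-value for the first covariate
    return np.array(pvals)

for p in (5, 50, 100):  # p/n grows from small to moderate
    pv = null_pvalues(n=400, p=p)
    # Under exact uniformity, about 5% of null p-values fall below 0.05.
    print(f"p = {p}: fraction of p-values below 0.05 = {np.mean(pv < 0.05):.3f}")
```

If the conventional asymptotics held, the printed fractions would stay near 0.05 for every p; a systematic drift away from 0.05 as p grows is the kind of early nonuniformity the paper characterizes for nonlinear models.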

Cite

Text

Fan et al. "Nonuniformity of P-Values Can Occur Early in Diverging Dimensions." Journal of Machine Learning Research, 20:1-33, 2019.

Markdown

[Fan et al. "Nonuniformity of P-Values Can Occur Early in Diverging Dimensions." Journal of Machine Learning Research, 20:1-33, 2019.](https://mlanthology.org/jmlr/2019/fan2019jmlr-nonuniformity/)

BibTeX

@article{fan2019jmlr-nonuniformity,
  title     = {{Nonuniformity of P-Values Can Occur Early in Diverging Dimensions}},
  author    = {Fan, Yingying and Demirkaya, Emre and Lv, Jinchi},
  journal   = {Journal of Machine Learning Research},
  year      = {2019},
  pages     = {1--33},
  volume    = {20},
  url       = {https://mlanthology.org/jmlr/2019/fan2019jmlr-nonuniformity/}
}