Over-Parameterized Deep Nonparametric Regression for Dependent Data with Its Applications to Reinforcement Learning

Abstract

In this paper, we provide statistical guarantees for over-parameterized deep nonparametric regression in the presence of dependent data. By decomposing the error, we establish non-asymptotic error bounds for the deep neural network estimator, achieved by balancing the approximation and generalization errors. We derive an approximation result for Hölder functions with constrained weights, and we bound the generalization error by the weight norm, which allows the number of neural network parameters to be much larger than the training sample size. Furthermore, we address the curse of dimensionality by assuming that the samples originate from distributions with low intrinsic dimension, which allows us to overcome the challenges posed by high-dimensional inputs. Finally, by incorporating an additional error propagation mechanism, we derive oracle inequalities for the over-parameterized deep fitted $Q$-iteration.
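
The paper itself contains no code; as a rough illustration of the procedure the abstract describes, the following is a minimal PyTorch sketch of fitted $Q$-iteration with an over-parameterized network, where a soft weight-norm penalty stands in for the norm constraint that controls generalization in the paper's analysis. All names (OverParamMLP, fitted_q_iteration, the penalty weight lam) and all hyperparameters are hypothetical, not taken from the paper.

import torch
import torch.nn as nn

class OverParamMLP(nn.Module):
    """Wide ReLU network; the width may greatly exceed the sample size."""
    def __init__(self, in_dim, width=2048, depth=3):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, width), nn.ReLU()]
            d = width
        layers.append(nn.Linear(d, 1))
        self.net = nn.Sequential(*layers)

    def weight_norm(self):
        # Sum of layer weight norms: the complexity measure that replaces
        # parameter counting in the generalization bound.
        return sum(p.norm() for p in self.parameters() if p.dim() > 1)

    def forward(self, x):
        return self.net(x).squeeze(-1)

def fitted_q_iteration(s, a, r, s_next, n_actions,
                       gamma=0.99, n_iters=10, lam=1e-4, epochs=200, lr=1e-3):
    """s, a, r, s_next: tensors of shapes (n, d), (n,), (n,), (n, d).
    One regression per iteration: fit Q to Bellman targets built from the
    previous iterate (the step whose errors the oracle inequality propagates)."""
    q = OverParamMLP(s.shape[1] + 1)  # state features plus a scalar action index
    for _ in range(n_iters):
        with torch.no_grad():  # greedy Bellman targets from the old Q
            next_q = torch.stack(
                [q(torch.cat([s_next,
                              torch.full((len(s_next), 1), float(b))], dim=1))
                 for b in range(n_actions)], dim=1)
            y = r + gamma * next_q.max(dim=1).values
        q_new = OverParamMLP(s.shape[1] + 1)
        opt = torch.optim.Adam(q_new.parameters(), lr=lr)
        x = torch.cat([s, a.unsqueeze(1).float()], dim=1)
        for _ in range(epochs):
            opt.zero_grad()
            # Penalized least squares; the penalty is a soft surrogate for
            # the hard weight-norm constraint used in the paper's theory.
            loss = ((q_new(x) - y) ** 2).mean() + lam * q_new.weight_norm()
            loss.backward()
            opt.step()
        q = q_new
    return q

Note that each iteration is itself a regression on dependent data, since transitions collected along a trajectory are not i.i.d.; this is exactly the regression setting the paper's non-asymptotic bounds cover.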

Cite

Text

Feng et al. "Over-Parameterized Deep Nonparametric Regression for Dependent Data with Its Applications to Reinforcement Learning." Journal of Machine Learning Research, 2023.

Markdown

[Feng et al. "Over-Parameterized Deep Nonparametric Regression for Dependent Data with Its Applications to Reinforcement Learning." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/feng2023jmlr-overparameterized/)

BibTeX

@article{feng2023jmlr-overparameterized,
  title     = {{Over-Parameterized Deep Nonparametric Regression for Dependent Data with Its Applications to Reinforcement Learning}},
  author    = {Feng, Xingdong and Jiao, Yuling and Kang, Lican and Zhang, Baqun and Zhou, Fan},
  journal   = {Journal of Machine Learning Research},
  year      = {2023},
  pages     = {1--40},
  volume    = {24},
  url       = {https://mlanthology.org/jmlr/2023/feng2023jmlr-overparameterized/}
}