Smoothness-Adaptive Dynamic Pricing with Nonparametric Demand Learning

Abstract

We study the dynamic pricing problem where the demand function is nonparametric and Hölder smooth, and we focus on adaptivity to the unknown Hölder smoothness parameter $\beta$ of the demand function. Traditionally, the optimal dynamic pricing algorithm relies heavily on knowledge of $\beta$ to achieve a minimax optimal regret of $\widetilde{O}(T^{\frac{\beta+1}{2\beta+1}})$. However, we highlight the challenge of adaptivity in this dynamic pricing problem by proving that no pricing policy can adaptively achieve this minimax optimal regret without knowledge of $\beta$. Motivated by the impossibility result, we propose a self-similarity condition to enable adaptivity. Importantly, we show that the self-similarity condition does not compromise the problem's inherent complexity, since it preserves the regret lower bound $\Omega(T^{\frac{\beta+1}{2\beta+1}})$. Furthermore, we develop a smoothness-adaptive dynamic pricing algorithm and theoretically prove that the algorithm achieves this minimax optimal regret bound without prior knowledge of $\beta$.
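As a quick illustration of the regret rate in the abstract (not code from the paper), the sketch below evaluates the exponent $\frac{\beta+1}{2\beta+1}$ of the minimax regret bound $\widetilde{O}(T^{\frac{\beta+1}{2\beta+1}})$ for a few smoothness levels; the function name `regret_exponent` is a hypothetical label introduced here for illustration.

```python
def regret_exponent(beta: float) -> float:
    """Exponent of T in the minimax regret bound for Holder smoothness beta."""
    return (beta + 1) / (2 * beta + 1)

# Smoother demand (larger beta) yields a smaller exponent,
# approaching the parametric rate T^(1/2) as beta grows.
for beta in (0.5, 1.0, 2.0, 10.0):
    print(f"beta={beta}: regret ~ T^{regret_exponent(beta):.3f}")
```

For example, $\beta = 1$ (Lipschitz-type smoothness) gives the familiar $\widetilde{O}(T^{2/3})$ rate, while the exponent tends to $1/2$ as $\beta \to \infty$.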

Cite

Text

Ye and Jiang. "Smoothness-Adaptive Dynamic Pricing with Nonparametric Demand Learning." Artificial Intelligence and Statistics, 2024.

Markdown

[Ye and Jiang. "Smoothness-Adaptive Dynamic Pricing with Nonparametric Demand Learning." Artificial Intelligence and Statistics, 2024.](https://mlanthology.org/aistats/2024/ye2024aistats-smoothnessadaptive/)

BibTeX

@inproceedings{ye2024aistats-smoothnessadaptive,
  title     = {{Smoothness-Adaptive Dynamic Pricing with Nonparametric Demand Learning}},
  author    = {Ye, Zeqi and Jiang, Hansheng},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2024},
  pages     = {1675--1683},
  volume    = {238},
  url       = {https://mlanthology.org/aistats/2024/ye2024aistats-smoothnessadaptive/}
}