Nearly Tight Bounds for Exploration in Streaming Multi-Armed Bandits with Known Optimality Gap

Cite

Text

Karpov and Wang. "Nearly Tight Bounds for Exploration in Streaming Multi-Armed Bandits with Known Optimality Gap." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I17.33956

Markdown

[Karpov and Wang. "Nearly Tight Bounds for Exploration in Streaming Multi-Armed Bandits with Known Optimality Gap." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/karpov2025aaai-nearly/) doi:10.1609/AAAI.V39I17.33956

BibTeX

@inproceedings{karpov2025aaai-nearly,
  title     = {{Nearly Tight Bounds for Exploration in Streaming Multi-Armed Bandits with Known Optimality Gap}},
  author    = {Karpov, Nikolai and Wang, Chen},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {17788--17796},
  doi       = {10.1609/AAAI.V39I17.33956},
  url       = {https://mlanthology.org/aaai/2025/karpov2025aaai-nearly/}
}