Pure Entropic Regularization for Metrical Task Systems

Abstract

We show that on every $n$-point hierarchically separated tree (HST) metric, there is a randomized online algorithm for metrical task systems (MTS) that is $1$-competitive for service costs and $O(\log n)$-competitive for movement costs. In general, these refined guarantees are optimal up to the implicit constant. While an $O(\log n)$-competitive algorithm for MTS on HST metrics was developed by Bubeck et al. (2018), that approach could only establish an $O((\log n)^2)$-competitive ratio when the service costs are required to be $O(1)$-competitive. Our algorithm is an instantiation of online mirror descent with the regularizer derived from a multiscale conditional entropy. In fact, our algorithm satisfies a set of even more refined guarantees; we are able to exploit this property to combine it with known random embedding theorems and obtain, for {\em any} $n$-point metric space, a randomized algorithm that is $1$-competitive for service costs and $O((\log n)^2)$-competitive for movement costs.
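
For intuition, the displays below sketch the general shape of such a regularizer and of the resulting dynamics on a weighted tree; this is a schematic illustration only, with the normalization $\eta$ and the smoothing parameters $\theta_u$ standing in as placeholders for the paper's precise (depth- and degree-dependent) choices. Writing $x_u$ for the probability mass at a vertex $u$, $p(u)$ for its parent, and $w_u$ for the weight of the edge joining them, a multiscale conditional entropy has the form

\[
\Phi(x) \;=\; \frac{1}{\eta} \sum_{u \neq \mathrm{root}} w_u \,\bigl( x_u + \theta_u\, x_{p(u)} \bigr) \log\!\left( \frac{x_u}{x_{p(u)}} + \theta_u \right).
\]

The algorithm then follows continuous-time mirror descent over the polytope $K$ of fractional states,

\[
\nabla^2 \Phi(x(t)) \, \dot{x}(t) \;=\; -\,c(t) + \lambda(t), \qquad \lambda(t) \in N_K(x(t)),
\]

where $c(t)$ is the instantaneous service cost vector and $N_K(x(t))$ is the normal cone to $K$ at $x(t)$: the state moves against the cost in the local geometry induced by $\nabla^2 \Phi$, with the term $\lambda(t)$ keeping it inside $K$.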

Cite

Text

Coester and Lee. "Pure Entropic Regularization for Metrical Task Systems." Conference on Learning Theory, 2019.

Markdown

[Coester and Lee. "Pure Entropic Regularization for Metrical Task Systems." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/coester2019colt-pure/)

BibTeX

@inproceedings{coester2019colt-pure,
  title     = {{Pure Entropic Regularization for Metrical Task Systems}},
  author    = {Coester, Christian and Lee, James R.},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {835--848},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/coester2019colt-pure/}
}