A Universal Error Measure for Input Predictions Applied to Online Graph Problems
Abstract
We introduce a novel measure for quantifying the error in input predictions. The error is based on a minimum-cost hyperedge cover in a suitably defined hypergraph and provides a general template which we apply to online graph problems. The measure captures errors due to absent predicted requests as well as unpredicted actual requests; hence, predicted and actual inputs can be of arbitrary size. We achieve refined performance guarantees for previously studied network design problems in the online-list model, such as Steiner tree and facility location. Further, we initiate the study of learning-augmented algorithms for online routing problems, such as the online traveling salesperson problem and the online dial-a-ride problem, where (transportation) requests arrive over time (online-time model). We provide a general algorithmic framework and we give error-dependent performance bounds that improve upon known worst-case barriers, when given accurate predictions, at the cost of slightly increased worst-case bounds when given predictions of arbitrary quality.
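To convey the flavor of such an error measure, the following Python sketch computes a simplified bipartite variant: predicted requests are matched to actual requests at a cost given by their distance, and every absent predicted request or unpredicted actual request pays a fixed penalty. This is only an illustrative assumption-laden toy; the paper's actual measure is a minimum-cost hyperedge cover in a suitably defined hypergraph, and the function name, distance, and penalty values below are not taken from the paper.

from itertools import permutations

def prediction_error(predicted, actual, dist, penalty):
    """Cheapest way to reconcile predicted with actual requests:
    a matched pair pays dist(p, a); every request left unmatched
    (absent prediction or unpredicted arrival) pays `penalty`.
    Brute force over matchings; intended for tiny toy instances only."""
    small, large = sorted((predicted, actual), key=len)
    best = float("inf")
    for image in permutations(range(len(large)), len(small)):
        # requests on the larger side that stay unmatched each pay the penalty
        cost = penalty * (len(large) - len(small))
        for i, j in enumerate(image):
            # pairing two requests is only worthwhile if it beats dropping both
            cost += min(dist(small[i], large[j]), 2 * penalty)
        best = min(best, cost)
    return best

if __name__ == "__main__":
    # requests as points on a line, distance = absolute difference
    predicted = [1.0, 4.0, 9.0]
    actual = [1.2, 8.5]
    err = prediction_error(predicted, actual, lambda p, a: abs(p - a), penalty=1.0)
    print(err)  # 0.2 + 0.5 + 1.0 = 1.7: two close matches, one absent prediction

Because both missing and surplus requests are charged, the measure stays well defined even when the predicted and actual inputs have different sizes, which is the property the abstract emphasizes.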
Cite
Text
Bernardini et al. "A Universal Error Measure for Input Predictions Applied to Online Graph Problems." Neural Information Processing Systems, 2022.
Markdown
[Bernardini et al. "A Universal Error Measure for Input Predictions Applied to Online Graph Problems." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/bernardini2022neurips-universal/)
BibTeX
@inproceedings{bernardini2022neurips-universal,
title = {{A Universal Error Measure for Input Predictions Applied to Online Graph Problems}},
author = {Bernardini, Giulia and Lindermayr, Alexander and Marchetti-Spaccamela, Alberto and Megow, Nicole and Stougie, Leen and Sweering, Michelle},
booktitle = {Neural Information Processing Systems},
year = {2022},
url = {https://mlanthology.org/neurips/2022/bernardini2022neurips-universal/}
}