Mixing Predictions for Online Metric Algorithms

Antonios Antoniadis, Christian Coester, Marek Eliáš, Adam Polak, Bertrand Simon

Research output: Working paper › Preprint › Academic


Abstract

A major technique in learning-augmented online algorithms is combining multiple algorithms or predictors. Since the performance of each predictor may vary over time, it is desirable to use not the single best predictor as a benchmark, but rather a dynamic combination which follows different predictors at different times. We design algorithms that combine predictions and are competitive against such dynamic combinations for a wide class of online problems, namely, metrical task systems. Against the best (in hindsight) unconstrained combination of ℓ predictors, we obtain a competitive ratio of O(ℓ²), and show that this is best possible. However, for a benchmark with a slightly constrained number of switches between different predictors, we can get a (1+ϵ)-competitive algorithm. Moreover, our algorithms can be adapted to access predictors in a bandit-like fashion, querying only one predictor at a time. An unexpected implication of one of our lower bounds is a new structural insight about covering formulations for the k-server problem.
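For intuition only, the sketch below illustrates the generic idea of combining ℓ predictors for a metrical task system: follow one predictor at a time and switch when its accumulated cost falls behind the current cheapest one by more than a threshold (a simple follow-the-cheapest-with-hysteresis rule). This is not the algorithm analyzed in the paper; the predictor interface, the metric function, and the cost model are assumptions made purely for illustration.

# Illustrative sketch only -- not the algorithm from the paper.
# Combines several metrical-task-system predictors by following one at a
# time and switching with hysteresis; all interfaces below are assumed.

def combine_predictors(predictors, cost_vectors, metric, threshold=1.0):
    """Serve a metrical task system by tracking one predictor at a time.

    predictors: list of callables; predictors[i](t) returns the state
                predictor i occupies at time t (assumed interface).
    cost_vectors: cost_vectors[t][state] is the service cost at time t.
    metric: metric(a, b) is the movement cost between states a and b.
    """
    ell = len(predictors)
    accumulated = [0.0] * ell          # total cost incurred by each predictor
    following = 0                      # index of the predictor we follow
    our_state = predictors[0](0)
    total_cost = 0.0

    for t, costs in enumerate(cost_vectors):
        # Charge each predictor its own movement plus service cost.
        for i in range(ell):
            prev = predictors[i](t - 1) if t > 0 else predictors[i](0)
            cur = predictors[i](t)
            accumulated[i] += metric(prev, cur) + costs[cur]

        # Switch to the currently cheapest predictor only if ours lags by
        # more than the threshold; the hysteresis limits how often we switch.
        best = min(range(ell), key=lambda i: accumulated[i])
        if accumulated[following] > accumulated[best] + threshold:
            following = best

        # Move to the followed predictor's state and pay our own costs.
        target = predictors[following](t)
        total_cost += metric(our_state, target) + costs[target]
        our_state = target

    return total_cost

The threshold trades off movement cost against responsiveness: a larger value switches less often but may stay longer with a predictor that has started performing poorly.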
Original language: English
Publisher: ArXiv.org
Publication status: Published - 15 Dec 2023

  • Mixing Predictions for Online Metric Algorithms

    Antoniadis, A., Coester, C., Eliáš, M., Polak, A. & Simon, B., Jul 2023, p. 969–983. 15 p.

    Research output: Contribution to conference › Paper › peer-review

    Open Access
