Aigner KM, Bärmann A, Braun K, Liers F, Pokutta S, Schneider O, Sharma K, Tschuppik S (2023): Data-driven Distributionally Robust Optimization over Time
Publication Language: English
Publication Status: Submitted
Publication Type: Journal article, Original article
Future Publication Type: Journal article
Publication year: 2023
URI: https://opus4.kobv.de/opus4-trr154/frontdoor/index/index/docId/496
Stochastic Optimization (SO) is a classical approach to optimization under uncertainty that typically requires knowledge of the probability distribution of the uncertain parameters. As the latter is often unknown, Distributionally Robust Optimization (DRO) provides a strong alternative that determines the best guaranteed solution over a set of distributions (the ambiguity set). In this work, we present an approach for DRO over time that uses online learning and scenario observations arriving as a data stream to learn more about the uncertainty. Our robust solutions adapt over time and reduce the cost of protection as the ambiguity shrinks. For various kinds of ambiguity sets, the robust solutions converge to the SO solution. Our algorithm achieves the optimization and learning goals without solving the DRO problem exactly at any step. We also provide a regret bound for the quality of the online strategy, which converges at a rate of $O(\log T / \sqrt{T})$, where $T$ is the number of iterations. Furthermore, we illustrate the effectiveness of our procedure in numerical experiments on mixed-integer optimization instances from popular benchmark libraries and give practical examples stemming from telecommunications and routing. Our algorithm solves the DRO-over-time problem significantly faster than standard reformulations.
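For orientation, the DRO problem sketched in the abstract can be written as a min-max problem; the notation below ($X$, $f$, $\xi$, $\mathcal{P}_t$, $P^*$) is chosen here for illustration and is not taken from the paper:

$\min_{x \in X} \; \sup_{P \in \mathcal{P}_t} \; \mathbb{E}_{\xi \sim P}[f(x, \xi)]$

Here $X$ is the feasible set, $f$ the cost function, $\xi$ the uncertain parameter, and $\mathcal{P}_t$ the ambiguity set available after $t$ scenario observations from the data stream. As $\mathcal{P}_t$ shrinks toward the true distribution $P^*$, the guaranteed value approaches the SO objective $\min_{x \in X} \mathbb{E}_{\xi \sim P^*}[f(x, \xi)]$, consistent with the convergence behavior described above.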
APA:
Aigner, K.-M., Bärmann, A., Braun, K., Liers, F., Pokutta, S., Schneider, O., ... Tschuppik, S. (2023). Data-driven Distributionally Robust Optimization over Time. INFORMS Journal on Optimization. https://doi.org/10.1287/ijoo.2023.0091
MLA:
Aigner, Kevin-Martin, et al. "Data-driven Distributionally Robust Optimization over Time." INFORMS Journal on Optimization (2023).