XOR-SGD: Provable Convex Stochastic Optimization for Decision-Making Under Uncertainty
Abstract
Many decision-making problems under uncertainty can be formulated as convex stochastic optimization, which minimizes a convex objective in expectation over exponentially many probabilistic scenarios. Despite its convexity, evaluating the objective function is #P-hard. Previous approaches approximate the objective with samples from MCMC and its variants, but these samplers suffer from slow mixing rates. We present XOR-SGD, a stochastic gradient descent (SGD) approach guaranteed to converge to solutions that are at most a constant away from the true optimum in a linear number of iterations. XOR-SGD harnesses XOR-sampling, which reduces the sample approximation of the expectation to queries of NP oracles via hashing and projection. We evaluate XOR-SGD on two real-world applications. The first, a stochastic inventory management problem, searches for a robust inventory plan in preparation for events such as virus pandemics and natural disasters. The second, a network design problem, decides an optimal land conservation plan that promotes the free movement of wildlife. We show that our approach finds better solutions with drastically fewer samples than a couple of state-of-the-art solvers.
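As a rough illustration of the setup described above, the following is a minimal Python sketch of an SGD loop whose gradient is estimated from a handful of sampled scenarios. The sampler is left as a placeholder argument: in the paper the samples come from XOR-sampling (hashing/projection plus NP-oracle queries), which is not reproduced here, and the function and parameter names (`xor_sgd`, `grad_f`, `sample_scenarios`, `k`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def xor_sgd(grad_f, sample_scenarios, x0, step_size=0.1, num_iters=100, k=10):
    """Sketch of an SGD loop with a sample-based gradient estimate.

    grad_f(x, scenario): gradient of the convex objective under one scenario.
    sample_scenarios(x, k): placeholder for the scenario sampler; XOR-SGD
        would draw these k scenarios via XOR-sampling (hashing/projection
        with NP-oracle queries), but any scenario sampler can be plugged in
        for illustration.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        # Approximate the expected gradient with an average over k sampled scenarios.
        scenarios = sample_scenarios(x, k)
        g = np.mean([grad_f(x, s) for s in scenarios], axis=0)
        # Plain gradient step; a projection onto the feasible set could follow here.
        x = x - step_size * g
    return x
```

The constant-approximation guarantee claimed in the abstract hinges on the quality of the XOR-sampling estimates; the sketch only shows the surrounding SGD scaffold.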
Cite
Text
Ding and Xue. "XOR-SGD: Provable Convex Stochastic Optimization for Decision-Making Under Uncertainty." Uncertainty in Artificial Intelligence, 2021.
Markdown
[Ding and Xue. "XOR-SGD: Provable Convex Stochastic Optimization for Decision-Making Under Uncertainty." Uncertainty in Artificial Intelligence, 2021.](https://mlanthology.org/uai/2021/ding2021uai-xorsgd/)
BibTeX
@inproceedings{ding2021uai-xorsgd,
title = {{XOR-SGD: Provable Convex Stochastic Optimization for Decision-Making Under Uncertainty}},
author = {Ding, Fan and Xue, Yexiang},
booktitle = {Uncertainty in Artificial Intelligence},
year = {2021},
pages = {151-160},
volume = {161},
url = {https://mlanthology.org/uai/2021/ding2021uai-xorsgd/}
}