Towards Faster Planning with Continuous Resources in Stochastic Domains
Abstract
Agents often have to construct plans that obey resource limits for continuous resources whose consumption can only be characterized by probability distributions. While Markov Decision Processes (MDPs) with a state space of continuous and discrete variables are popular for modeling these domains, current algorithms for such MDPs can exhibit poor performance with a scale-up in their state space. To remedy that we propose an algorithm called DPFP. DPFP's key contribution is its exploitation of the dual space of cumulative distribution functions. This dual formulation is key to DPFP's novel combination of three features. First, it enables DPFP's membership in a class of algorithms that perform forward search in a large (possibly infinite) policy space. Second, it provides a new and efficient approach for varying the policy generation effort based on the likelihood of reaching different regions of the MDP state space. Third, it yields a bound on the error produced by such approximations. These three features conspire to allow DPFP's superior performance and systematic trade-off of optimality for speed. Our experimental evaluation shows that, when run stand-alone, DPFP outperforms other algorithms in terms of its any-time performance, whereas when run as a hybrid, it allows for a significant speedup of a leading continuous resource MDP solver.
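To make the problem setting concrete, the sketch below shows a single transition of a toy continuous-resource MDP: an action's resource consumption (e.g., its duration) is stochastic, drawn here from an assumed gamma distribution, and the agent fails once the resource limit is exhausted. This is an illustrative model only, not the paper's DPFP algorithm; the `step` function, the distribution parameters, and the reward values are all hypothetical.

```python
import random

def step(state, action, resource_left):
    """One transition of a toy continuous-resource MDP.

    The action's consumption is continuous and stochastic; here it is
    modeled by an (assumed) gamma distribution. If consumption exhausts
    the remaining resource, the episode ends with zero reward.
    """
    consumed = random.gammavariate(2.0, 1.0)  # hypothetical duration model
    resource_left -= consumed
    if resource_left <= 0:
        return None, 0.0          # resource limit violated: no reward
    next_state = (state, action)  # placeholder successor state
    return next_state, 1.0        # unit reward for a successful action

# With no resource left, any positive consumption causes failure.
terminal, reward = step("s0", "a0", resource_left=0.0)
```

Because only the distribution of `consumed` is known, a planner must reason about the probability of reaching each region of the (state, remaining-resource) space, which is exactly the likelihood information DPFP's dual formulation over cumulative distribution functions exploits.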
Cite
Text
Marecki and Tambe. "Towards Faster Planning with Continuous Resources in Stochastic Domains." AAAI Conference on Artificial Intelligence, 2008.
Markdown
[Marecki and Tambe. "Towards Faster Planning with Continuous Resources in Stochastic Domains." AAAI Conference on Artificial Intelligence, 2008.](https://mlanthology.org/aaai/2008/marecki2008aaai-faster/)
BibTeX
@inproceedings{marecki2008aaai-faster,
title = {{Towards Faster Planning with Continuous Resources in Stochastic Domains}},
author = {Marecki, Janusz and Tambe, Milind},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2008},
pages = {1049--1055},
url = {https://mlanthology.org/aaai/2008/marecki2008aaai-faster/}
}