Symbolic Heuristic Search for Factored Markov Decision Processes
Abstract
We describe a planning algorithm that integrates two approaches to solving Markov decision processes with large state spaces. State abstraction is used to avoid evaluating states individually. Forward search from a start state, guided by an admissible heuristic, is used to avoid evaluating all states. We combine these two approaches in a novel way that exploits symbolic model-checking techniques and demonstrates their usefulness for decision-theoretic planning.
Cite
Text
Feng and Hansen. "Symbolic Heuristic Search for Factored Markov Decision Processes." AAAI Conference on Artificial Intelligence, 2002. doi:10.5555/777092.777164
Markdown
[Feng and Hansen. "Symbolic Heuristic Search for Factored Markov Decision Processes." AAAI Conference on Artificial Intelligence, 2002.](https://mlanthology.org/aaai/2002/feng2002aaai-symbolic/) doi:10.5555/777092.777164
BibTeX
@inproceedings{feng2002aaai-symbolic,
title = {{Symbolic Heuristic Search for Factored Markov Decision Processes}},
author = {Feng, Zhengzhu and Hansen, Eric A.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2002},
pages = {455--460},
doi = {10.5555/777092.777164},
url = {https://mlanthology.org/aaai/2002/feng2002aaai-symbolic/}
}