Progressive Abstraction Refinement for Sparse Sampling
Abstract
Monte Carlo tree search (MCTS) algorithms can encounter difficulties when solving Markov decision processes (MDPs) in which the outcomes of actions are highly stochastic. This stochastic branching can be reduced through state abstraction. In online planning with a time budget, there is a complex tradeoff between the loss in performance due to overly coarse abstraction versus the gain in performance from reducing the problem size. We find empirically that very coarse and unsound abstractions often outperform sound abstractions for practical planning budgets. Motivated by this, we propose a progressive abstraction refinement algorithm that refines an initially coarse abstraction during search in order to match the abstraction granularity to the sample budget. Our experiments demonstrate the strong performance of search with coarse abstractions, and show that our proposed algorithm combines the benefits of coarse abstraction at small sample budgets with the ability to exploit larger budgets for further performance gains.
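To make the idea concrete, here is a minimal, purely illustrative sketch (not the authors' implementation) of sparse sampling on a toy stochastic MDP, where sampled successor states are aggregated by an abstraction function and the abstraction is progressively refined from coarse to fine. All names, the toy dynamics, and the granularity schedule are assumptions for illustration only.

```python
import random
from collections import defaultdict

# Toy stochastic MDP (illustrative, not from the paper): states 0..9,
# actions move left/right with random noise, reward = next state index.
ACTIONS = ("left", "right")

def step(state, action, rng):
    move = -1 if action == "left" else 1
    noise = rng.choice((-1, 0, 1))          # highly stochastic outcome
    nxt = max(0, min(9, state + move + noise))
    return nxt, float(nxt)

def sparse_sample_q(state, action, depth, width, abstract, rng):
    """Sparse-sampling estimate of Q(state, action). Sampled successors
    are grouped into abstract classes, and one representative per class
    is expanded, reducing stochastic branching."""
    if depth == 0:
        return 0.0
    classes = defaultdict(list)
    for _ in range(width):
        nxt, r = step(state, action, rng)
        classes[abstract(nxt)].append((nxt, r))
    total = 0.0
    for members in classes.values():
        rep, _ = members[0]                 # expand one representative
        mean_r = sum(r for _, r in members) / len(members)
        v = max(sparse_sample_q(rep, a, depth - 1, width, abstract, rng)
                for a in ACTIONS)
        total += len(members) * (mean_r + 0.9 * v)
    return total / width

def make_abstraction(granularity):
    # granularity=10 lumps all states together; granularity=1 is identity.
    return lambda s: s // granularity

rng = random.Random(0)
# Progressive refinement: start coarse, refine as the sample budget grows.
for granularity in (10, 5, 1):
    abstract = make_abstraction(granularity)
    q = {a: sparse_sample_q(5, a, depth=3, width=8, abstract=abstract, rng=rng)
         for a in ACTIONS}
    best = max(q, key=q.get)
```

In this toy problem, moving right increases both immediate and future reward, so the refined search should prefer `right`; the point of the sketch is only how coarsening the abstraction shrinks the number of classes expanded per node.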
Cite
Text
Hostetler et al. "Progressive Abstraction Refinement for Sparse Sampling." Conference on Uncertainty in Artificial Intelligence, 2015.
Markdown
[Hostetler et al. "Progressive Abstraction Refinement for Sparse Sampling." Conference on Uncertainty in Artificial Intelligence, 2015.](https://mlanthology.org/uai/2015/hostetler2015uai-progressive/)
BibTeX
@inproceedings{hostetler2015uai-progressive,
  title = {{Progressive Abstraction Refinement for Sparse Sampling}},
  author = {Hostetler, Jesse and Fern, Alan and Dietterich, Thomas G.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year = {2015},
  pages = {365--374},
  url = {https://mlanthology.org/uai/2015/hostetler2015uai-progressive/}
}