Decision Making Under Uncertainty: Operations Research Meets AI (Again)

Abstract

Models for sequential decision making under uncertainty (e.g., Markov decision processes, or MDPs) have been studied in operations research for decades. The recent incorporation of ideas from many areas of AI (including planning, probabilistic modeling, machine learning, and knowledge representation) has made these models much more widely applicable. I briefly survey recent advances within AI in the use of fully- and partially-observable MDPs as a modeling tool, and the development of computationally manageable solution methods. I will place special emphasis on factored problem representations such as Bayesian networks and on algorithms that exploit the structure inherent in these representations.
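To make the MDP setting concrete, here is a minimal value-iteration sketch for a fully observable MDP. The two-state "machine maintenance" problem below is a hypothetical illustration, not an example from the paper, and this flat (enumerated-state) formulation is exactly what the factored representations surveyed in the paper aim to avoid for large state spaces.

```python
# Minimal value iteration for a tiny, fully observable MDP.
# The two-state "machine maintenance" model is a hypothetical example.

def value_iteration(states, actions, P, R, gamma=0.9, tol=1e-6):
    """Compute the optimal value function V* by repeated Bellman backups.

    P[s][a] is a list of (next_state, probability) pairs; R[s][a] is the
    expected immediate reward for taking action a in state s.
    """
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            # Bellman optimality backup: best one-step lookahead value.
            best = max(
                R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

# Two states (machine 'ok' or 'broken') and two actions ('run', 'repair').
states = ["ok", "broken"]
actions = ["run", "repair"]
P = {
    "ok":     {"run": [("ok", 0.9), ("broken", 0.1)], "repair": [("ok", 1.0)]},
    "broken": {"run": [("broken", 1.0)],              "repair": [("ok", 1.0)]},
}
R = {
    "ok":     {"run": 1.0, "repair": -0.5},
    "broken": {"run": 0.0, "repair": -0.5},
}
V = value_iteration(states, actions, P, R)
# A working machine is worth more than a broken one: V["ok"] > V["broken"].
```

Note that the table-based representation of `P` and `R` grows with the full state space; factored approaches instead represent transitions and rewards over state *variables* (e.g., with Bayesian networks), which is the structure the paper emphasizes.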

Cite

Text

Boutilier. "Decision Making Under Uncertainty: Operations Research Meets AI (Again)." AAAI Conference on Artificial Intelligence, 2000.

Markdown

[Boutilier. "Decision Making Under Uncertainty: Operations Research Meets AI (Again)." AAAI Conference on Artificial Intelligence, 2000.](https://mlanthology.org/aaai/2000/boutilier2000aaai-decision/)

BibTeX

@inproceedings{boutilier2000aaai-decision,
  title     = {{Decision Making Under Uncertainty: Operations Research Meets AI (Again)}},
  author    = {Boutilier, Craig},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2000},
  pages     = {1145--1150},
  url       = {https://mlanthology.org/aaai/2000/boutilier2000aaai-decision/}
}