Abstraction in Predictive State Representations

Abstract

Most work on Predictive Representations of State (PSRs) focuses on learning a complete model of the system that can be used to answer any question about the future. However, we may be interested only in answering certain kinds of abstract questions. For instance, we may only care about the presence of objects in an image rather than pixel-level details. In such cases, we may be able to learn substantially smaller models that answer only such abstract questions. We present the framework of PSR homomorphisms for model abstraction in PSRs. A homomorphism transforms a given PSR into a smaller PSR that provides exact answers to abstract questions in the original PSR. As we shall show, this transformation captures structural and temporal abstractions in the original PSR.

Cite

Text

Soni and Singh. "Abstraction in Predictive State Representations." AAAI Conference on Artificial Intelligence, 2007.

Markdown

[Soni and Singh. "Abstraction in Predictive State Representations." AAAI Conference on Artificial Intelligence, 2007.](https://mlanthology.org/aaai/2007/soni2007aaai-abstraction/)

BibTeX

@inproceedings{soni2007aaai-abstraction,
  title     = {{Abstraction in Predictive State Representations}},
  author    = {Soni, Vishal and Singh, Satinder},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2007},
  pages     = {639--644},
  url       = {https://mlanthology.org/aaai/2007/soni2007aaai-abstraction/}
}