Using Deep Structure to Locate Hard Problems

Abstract

One usually writes A.I. programs to be used on a range of examples which, although similar in kind, differ in detail. This paper shows how to predict where, in a space of problem instances, the hardest problems are to be found and where the fluctuations in difficulty are greatest. Our key insight is to shift emphasis from modelling sophisticated algorithms directly to modelling a search space which captures their principal effects. This allows us to analyze complex A.I. problems in a simple and intuitive way. We present a sample analysis, compare our model's quantitative predictions with data obtained independently, and describe how to exploit the results to estimate the value of preprocessing. Finally, we circumscribe the kinds of problems to which the methodology is suited.

Introduction

The qualitative existence of abrupt changes in computational cost has been predicted theoretically in (Purdom 1983, Franco & Paull 1983, Huberman & Hogg 1987) and observed empirically in (Pa...

Cite

Text

Williams and Hogg. "Using Deep Structure to Locate Hard Problems." AAAI Conference on Artificial Intelligence, 1992.

Markdown

[Williams and Hogg. "Using Deep Structure to Locate Hard Problems." AAAI Conference on Artificial Intelligence, 1992.](https://mlanthology.org/aaai/1992/williams1992aaai-using/)

BibTeX

@inproceedings{williams1992aaai-using,
  title     = {{Using Deep Structure to Locate Hard Problems}},
  author    = {Williams, Colin P. and Hogg, Tad},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {1992},
  pages     = {472--477},
  url       = {https://mlanthology.org/aaai/1992/williams1992aaai-using/}
}