The (Un)Scalability of Heuristic Approximators for NP-Hard Search Problems
Abstract
The A* algorithm is commonly used to solve NP-hard combinatorial optimization problems. When provided with a completely informed heuristic function, A* solves many NP-hard minimum-cost path problems in time polynomial in the branching factor and the number of edges in a minimum-cost path. Thus, approximating their completely informed heuristic functions with high precision is NP-hard. We therefore examine recent publications that propose the use of neural networks for this purpose. We support our claim that these approaches do not scale to large instance sizes both theoretically and experimentally. Our first experimental results for three representative NP-hard minimum-cost path problems suggest that using neural networks to approximate completely informed heuristic functions with high precision might result in network sizes that scale exponentially in the instance sizes. The research community might thus benefit from investigating other ways of integrating heuristic search with machine learning.
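The key observation in the abstract — that A* with a completely informed heuristic expands only nodes on a minimum-cost path — can be illustrated with a minimal sketch. The toy graph, edge weights, and helper names below are assumptions for illustration, not from the paper; the perfect heuristic is precomputed here by brute-force relaxation, which is exactly the step that is intractable for the NP-hard problems the paper studies.

```python
import heapq

# Toy weighted digraph (hypothetical): adjacency lists of (neighbor, cost).
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("D", 2)],
    "C": [("D", 1)],
    "D": [],
}

def true_cost_to_goal(graph, goal):
    """Exact cost-to-go for every node (the 'completely informed' heuristic).
    Computed by Bellman-Ford-style relaxation; obtaining this function is
    the hard part in general."""
    costs = {goal: 0}
    changed = True
    while changed:
        changed = False
        for u, edges in graph.items():
            for v, w in edges:
                if v in costs and costs.get(u, float("inf")) > w + costs[v]:
                    costs[u] = w + costs[v]
                    changed = True
    return costs

def astar(graph, start, goal, h):
    """A* search; with a perfect heuristic it expands only nodes on an
    optimal path, so expansions grow with path length, not state-space size."""
    frontier = [(h.get(start, float("inf")), 0, start, [start])]
    closed = set()
    expansions = 0
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path, expansions
        if node in closed:
            continue
        closed.add(node)
        expansions += 1
        for nbr, w in graph[node]:
            heapq.heappush(
                frontier,
                (g + w + h.get(nbr, float("inf")), g + w, nbr, path + [nbr]),
            )
    return None

h = true_cost_to_goal(graph, "D")
cost, path, expansions = astar(graph, "A", "D", h)
print(cost, path, expansions)  # → 3 ['A', 'B', 'D'] 2
```

Note that the off-path node `C` is never expanded: its f-value (4 + 1 = 5) always exceeds the optimal cost 3, so the perfect heuristic steers the search directly along the minimum-cost path.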
Cite
Text
Pendurkar et al. "The (Un)Scalability of Heuristic Approximators for NP-Hard Search Problems." NeurIPS 2022 Workshops: ICBINB, 2022.
Markdown
[Pendurkar et al. "The (Un)Scalability of Heuristic Approximators for NP-Hard Search Problems." NeurIPS 2022 Workshops: ICBINB, 2022.](https://mlanthology.org/neuripsw/2022/pendurkar2022neuripsw-un/)
BibTeX
@inproceedings{pendurkar2022neuripsw-un,
title = {{The (Un)Scalability of Heuristic Approximators for NP-Hard Search Problems}},
author = {Pendurkar, Sumedh and Huang, Taoan and Koenig, Sven and Sharon, Guni},
booktitle = {NeurIPS 2022 Workshops: ICBINB},
year = {2022},
url = {https://mlanthology.org/neuripsw/2022/pendurkar2022neuripsw-un/}
}