Fitness Distance Correlation of Neural Network Error Surfaces: A Scalable, Continuous Optimization Problem
Abstract
This paper investigates neural network training as a potential source of problems for benchmarking continuous, heuristic optimization algorithms. Using a student-teacher learning paradigm, the error surfaces of several neural networks are examined via so-called fitness distance correlation, which has previously been applied to discrete, combinatorial optimization problems. The results suggest that neural network training tasks offer a number of desirable properties for algorithm benchmarking, including the ability to scale up to provide challenging problems in high-dimensional spaces.
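The measure the abstract refers to is simple to reproduce: sample points in weight space, record each point's error (fitness) and its distance to a known global optimum, and correlate the two. A minimal sketch in Python, assuming a single-hidden-layer student-teacher setup where the teacher's weights form the known optimum (network sizes, sampling ranges, and all helper names here are illustrative choices, not details taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 2 inputs, 3 hidden units, 100 training samples,
# 1000 random points at which the error surface is probed.
n_in, n_hid, n_samples, n_points = 2, 3, 100, 1000
dim = n_in * n_hid + n_hid + n_hid  # W1, b1, W2 flattened together

def unpack(w):
    """Split a flat weight vector into layer parameters."""
    W1 = w[:n_in * n_hid].reshape(n_hid, n_in)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    W2 = w[-n_hid:]
    return W1, b1, W2

def forward(w, X):
    """Single-hidden-layer network with tanh hidden units, linear output."""
    W1, b1, W2 = unpack(w)
    return np.tanh(X @ W1.T + b1) @ W2

# Teacher network: its weights are, by construction, a global optimum
# (zero error) of the student's error surface.
teacher_w = rng.normal(size=dim)
X = rng.normal(size=(n_samples, n_in))
y_teacher = forward(teacher_w, X)

def mse(w):
    """Student's error surface: mean squared error against teacher outputs."""
    return np.mean((forward(w, X) - y_teacher) ** 2)

# Probe the surface at random weight vectors; record fitness and the
# Euclidean distance to the teacher's weights.
W = rng.uniform(-2.0, 2.0, size=(n_points, dim))
f = np.array([mse(w) for w in W])
d = np.linalg.norm(W - teacher_w, axis=1)

# Fitness distance correlation: r = cov(f, d) / (sigma_f * sigma_d),
# i.e. the ordinary Pearson correlation of fitness with distance.
fdc = np.corrcoef(f, d)[0, 1]
print(f"FDC = {fdc:.3f}")
```

Note that weight-space symmetries (e.g. permutations of hidden units) create equivalent optima far from the teacher's weights, which is one reason the paper's FDC values for these surfaces are interesting rather than trivially close to 1.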
Cite
Text
Gallagher. "Fitness Distance Correlation of Neural Network Error Surfaces: A Scalable, Continuous Optimization Problem." European Conference on Machine Learning, 2001. doi:10.1007/3-540-44795-4_14
Markdown
[Gallagher. "Fitness Distance Correlation of Neural Network Error Surfaces: A Scalable, Continuous Optimization Problem." European Conference on Machine Learning, 2001.](https://mlanthology.org/ecmlpkdd/2001/gallagher2001ecml-fitness/) doi:10.1007/3-540-44795-4_14
BibTeX
@inproceedings{gallagher2001ecml-fitness,
title = {{Fitness Distance Correlation of Neural Network Error Surfaces: A Scalable, Continuous Optimization Problem}},
author = {Gallagher, Marcus},
booktitle = {European Conference on Machine Learning},
year = {2001},
pages = {157-166},
doi = {10.1007/3-540-44795-4_14},
url = {https://mlanthology.org/ecmlpkdd/2001/gallagher2001ecml-fitness/}
}