An Archive Can Bring Provable Speed-Ups in Multi-Objective Evolutionary Algorithms
Abstract
Deep neural networks have achieved remarkable performance across a variety of applications. However, their decision-making processes are opaque. In contrast, k-nearest neighbor (k-NN) provides interpretable predictions by relying on similar cases, but it lacks important capabilities of neural networks. The neural network k-nearest neighbor (NN-kNN) model is designed to bridge this gap, combining the benefits of neural networks with the instance-based interpretability of k-NN. However, the initial formulation of NN-kNN had limitations, including scalability issues, reliance on surface-level features, and an excessive number of parameters. This paper improves NN-kNN by enhancing its scalability, parameter efficiency, ease of integration with feature extractors, and training simplicity. An evaluation of the revised architecture on image and language classification tasks illustrates its promise as a flexible and interpretable method.
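The abstract contrasts opaque neural networks with the case-based interpretability of k-NN, whose predictions can be explained by pointing at the specific training examples that support them. The following is a minimal, self-contained sketch of that idea in plain Python; it is not the paper's NN-kNN model, and the data, function name, and `k` value are purely illustrative.

```python
# Minimal k-NN classifier that returns both a prediction and the
# supporting neighbors, illustrating case-based interpretability.
# All data and names here are illustrative, not from the paper.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns (predicted_label, k nearest training cases)."""
    # Rank training cases by Euclidean distance to the query.
    by_dist = sorted(train, key=lambda ex: math.dist(ex[0], query))
    neighbors = by_dist[:k]
    # Majority vote among the k nearest cases.
    label = Counter(lbl for _, lbl in neighbors).most_common(1)[0][0]
    # The neighbors themselves serve as the "explanation" of the prediction.
    return label, neighbors

# Toy two-class dataset in 2-D feature space.
train = [
    ((1.0, 1.0), "A"), ((1.2, 0.9), "A"),
    ((4.0, 4.1), "B"), ((3.9, 4.3), "B"), ((4.2, 3.8), "B"),
]
label, support = knn_predict(train, (1.1, 1.0), k=3)
```

Here `support` lists the three training cases closest to the query, so a user can inspect exactly which past examples drove the decision; this is the property NN-kNN aims to preserve while adding the representational power of a neural network.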
Cite
Text
Bian et al. "An Archive Can Bring Provable Speed-Ups in Multi-Objective Evolutionary Algorithms." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/763
Markdown
[Bian et al. "An Archive Can Bring Provable Speed-Ups in Multi-Objective Evolutionary Algorithms." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/bian2024ijcai-archive/) doi:10.24963/ijcai.2024/763
BibTeX
@inproceedings{bian2024ijcai-archive,
title = {{An Archive Can Bring Provable Speed-Ups in Multi-Objective Evolutionary Algorithms}},
author = {Bian, Chao and Ren, Shengjie and Li, Miqing and Qian, Chao},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {6905--6913},
doi = {10.24963/ijcai.2024/763},
url = {https://mlanthology.org/ijcai/2024/bian2024ijcai-archive/}
}