Metric Nearness Made Practical
Abstract
Given a square matrix of noisy dissimilarity measures between pairs of data samples, the metric nearness model computes the best approximation of the matrix from the set of valid distance metrics. Despite its wide applications in machine learning and data processing tasks, the model incurs non-trivial computational cost in seeking the solution, owing to the large number of metric (triangle-inequality) constraints that define the feasible region. Our work designs a practical two-stage approach that tackles this challenge and improves the model's scalability and applicability. The first stage computes a fast yet high-quality approximate solution from a set of isometrically embeddable metrics, further improved by an effective heuristic. The second stage refines the approximate solution with the Halpern-Lions-Wittmann-Bauschke projection algorithm, which converges quickly to the optimal solution. In empirical evaluations, the proposed approach runs at least an order of magnitude faster than the state-of-the-art solutions, with significantly improved scalability, complete conformity to constraints, lower memory consumption, and other desirable features in real applications.
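To make the constraint structure concrete, below is a minimal, illustrative sketch (not the paper's algorithm) of the metric nearness setting: a symmetric dissimilarity matrix is repeatedly adjusted so that every triangle inequality holds, by projecting onto one violated constraint at a time in the spirit of cyclic-projection / triangle-fixing methods. The function name and loop structure are hypothetical; a plain cyclic sweep like this only restores feasibility, whereas recovering the nearest metric requires correction terms (e.g., Dykstra-style updates) or a refinement stage such as the HLWB projection described in the abstract.

```python
import numpy as np

def enforce_triangle_inequalities(D, n_sweeps=50):
    """Illustrative sketch only: sweep over triples (i, j, k) and project
    each violated triangle inequality X[i, j] <= X[i, k] + X[k, j].

    Returns a symmetric matrix that (approximately) satisfies all triangle
    inequalities; it is not guaranteed to be the nearest metric to D.
    """
    X = np.asarray(D, dtype=float).copy()
    n = X.shape[0]
    for _ in range(n_sweeps):
        changed = False
        for i in range(n):
            for j in range(i + 1, n):
                for k in range(n):
                    if k == i or k == j:
                        continue
                    # Violation of X[i, j] <= X[i, k] + X[k, j]
                    v = X[i, j] - (X[i, k] + X[k, j])
                    if v > 1e-12:
                        # Least-squares projection onto this single constraint:
                        # the violation is spread equally over the three entries.
                        d = v / 3.0
                        X[i, j] -= d; X[j, i] -= d
                        X[i, k] += d; X[k, i] += d
                        X[k, j] += d; X[j, k] += d
                        changed = True
        if not changed:
            break
    return X

# Example usage on a small noisy dissimilarity matrix.
D = np.array([[0.0, 1.0, 5.0],
              [1.0, 0.0, 1.0],
              [5.0, 1.0, 0.0]])
print(enforce_triangle_inequalities(D))
```

The O(n^3) cost per sweep over all triples is exactly the computational burden the paper's two-stage approach is designed to avoid at scale.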
Cite
Text
Li et al. "Metric Nearness Made Practical." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I7.26041

Markdown
[Li et al. "Metric Nearness Made Practical." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/li2023aaai-metric/) doi:10.1609/AAAI.V37I7.26041

BibTeX
@inproceedings{li2023aaai-metric,
title = {{Metric Nearness Made Practical}},
author = {Li, Wenye and Yu, Fangchen and Ma, Zichen},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2023},
pages = {8648-8656},
doi = {10.1609/AAAI.V37I7.26041},
url = {https://mlanthology.org/aaai/2023/li2023aaai-metric/}
}