Calibrating Distance Metrics Under Uncertainty
Abstract
Estimating distance metrics for given data samples is essential to many machine learning algorithms, with a wide range of applications. Accurately determining the metric becomes difficult when observations are noisy or contain missing values. In this work, we propose an approach to calibrating distance metrics. Compared with standard practices, which rely primarily on data imputation, our proposal makes fewer assumptions about the data and provides a solid theoretical guarantee of improving the quality of the estimate. We develop a simple, efficient, yet effective computing procedure that scales up to realize the calibration process. Results from a series of empirical evaluations justify the benefits of the proposed approach and demonstrate its high potential in practical applications.
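The paper's own calibration algorithm is not reproduced here, but the general idea of repairing a noisy distance matrix can be illustrated with a standard heuristic: double-center the squared distances, clip negative eigenvalues of the resulting Gram matrix, and reconstruct distances that are symmetric and Euclidean-embeddable. This sketch uses NumPy and is an illustrative baseline, not the method proposed by Li and Yu.

```python
import numpy as np

def calibrate_distances(D):
    """Repair a noisy distance matrix via classical-MDS-style
    eigenvalue clipping (a generic heuristic, not the paper's method)."""
    n = D.shape[0]
    D = (D + D.T) / 2.0           # enforce symmetry
    np.fill_diagonal(D, 0.0)      # zero self-distances
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ (D ** 2) @ J   # double-centered Gram matrix
    w, V = np.linalg.eigh(G)
    w = np.clip(w, 0.0, None)     # drop negative eigenvalues (non-metric part)
    G_psd = (V * w) @ V.T         # nearest PSD Gram matrix in Frobenius norm
    g = np.diag(G_psd)
    D2 = g[:, None] + g[None, :] - 2.0 * G_psd
    return np.sqrt(np.clip(D2, 0.0, None))
```

The output is guaranteed to be a symmetric matrix of distances realizable by points in Euclidean space, which in particular satisfies the triangle inequality.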
Cite
Text
Li and Yu. "Calibrating Distance Metrics Under Uncertainty." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022. doi:10.1007/978-3-031-26409-2_14
Markdown
[Li and Yu. "Calibrating Distance Metrics Under Uncertainty." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2022.](https://mlanthology.org/ecmlpkdd/2022/li2022ecmlpkdd-calibrating/) doi:10.1007/978-3-031-26409-2_14
BibTeX
@inproceedings{li2022ecmlpkdd-calibrating,
title = {{Calibrating Distance Metrics Under Uncertainty}},
author = {Li, Wenye and Yu, Fangchen},
booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
year = {2022},
  pages = {219--234},
doi = {10.1007/978-3-031-26409-2_14},
url = {https://mlanthology.org/ecmlpkdd/2022/li2022ecmlpkdd-calibrating/}
}