Multi-Label Dimensionality Reduction via Dependence Maximization
Abstract
Multi-label learning deals with data associated with multiple labels simultaneously. Like other machine learning and data mining tasks, multi-label learning also suffers from the curse of dimensionality. Although dimensionality reduction has been studied for many years, multi-label dimensionality reduction remains almost untouched. In this paper, we propose a multi-label dimensionality reduction method, MDDM, which attempts to project the original data into a lower-dimensional feature space maximizing the dependence between the original feature description and the associated class labels. Based on the Hilbert-Schmidt Independence Criterion, we derive a closed-form solution which enables the dimensionality reduction process to be efficient. Experiments validate the performance of MDDM.
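The core idea described in the abstract, maximizing Hilbert-Schmidt Independence Criterion (HSIC) based dependence between projected features and the label kernel, can be sketched as an eigenvalue problem. The snippet below is a minimal illustration, not the paper's exact algorithm: it assumes a linear label kernel, an orthonormality constraint on the projection, and the empirical HSIC estimate, under which the optimal projection consists of the top eigenvectors of a centered cross-covariance-style matrix. The function name `mddm_projection` and all parameter choices are illustrative.

```python
import numpy as np

def mddm_projection(X, Y, k):
    """Illustrative HSIC-maximizing linear projection.

    X : (n, d) feature matrix, one instance per row.
    Y : (n, q) binary label-indicator matrix.
    k : target dimensionality (k <= d).
    Returns P, a (d, k) matrix with orthonormal columns; projected
    data is X @ P.
    """
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    L = Y @ Y.T                           # linear kernel over label vectors
    # Empirical HSIC of the projected data (up to a constant factor)
    # is tr(P^T X^T H L H X P); under P^T P = I it is maximized by
    # the top-k eigenvectors of the symmetric matrix M below.
    M = X.T @ H @ L @ H @ X               # (d, d), symmetric PSD
    eigvals, eigvecs = np.linalg.eigh(M)  # ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:k]   # indices of k largest
    return eigvecs[:, top]
```

Because the objective reduces to a symmetric eigenproblem, the solution is closed-form and costs one `d x d` eigendecomposition, which matches the efficiency claim in the abstract.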
Cite
Zhang, Yin and Zhou, Zhi-Hua. "Multi-Label Dimensionality Reduction via Dependence Maximization." AAAI Conference on Artificial Intelligence, 2008, pp. 1503-1505. https://mlanthology.org/aaai/2008/zhang2008aaai-multi-a/
@inproceedings{zhang2008aaai-multi-a,
title = {{Multi-Label Dimensionality Reduction via Dependence Maximization}},
author = {Zhang, Yin and Zhou, Zhi-Hua},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2008},
pages = {1503-1505},
url = {https://mlanthology.org/aaai/2008/zhang2008aaai-multi-a/}
}