ELTA: An Enhancer Against Long-Tail for Aesthetics-Oriented Models
Abstract
Real-world datasets often exhibit long-tailed distributions, compromising the generalization and fairness of learning-based models. This issue is particularly pronounced in Image Aesthetics Assessment (IAA) tasks, where such imbalance is difficult to mitigate due to a severe distribution mismatch between features and labels, as well as the great sensitivity of aesthetics to image variations. To address these issues, we propose an Enhancer against Long-Tail for Aesthetics-oriented models (ELTA). ELTA first utilizes a dedicated mixup technique to enhance minority feature representation in high-level space while preserving their intrinsic aesthetic qualities. Next, it aligns features and labels through a similarity consistency approach, effectively alleviating the distribution mismatch. Finally, ELTA adopts a specific strategy to refine the output distribution, thereby enhancing the quality of pseudo-labels. Experiments on four representative datasets (AVA, AADB, TAD66K, and PARA) show that our proposed ELTA achieves state-of-the-art performance by effectively mitigating the long-tailed issue in IAA datasets. Moreover, ELTA is designed with plug-and-play capabilities for seamless integration with existing methods. To our knowledge, this is the first contribution in the IAA community addressing the long-tail issue. All resources are available here.
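The abstract's first component blends minority samples in high-level feature space rather than in pixel space. The paper's dedicated variant is not detailed here; the sketch below shows only the generic feature-space (manifold) mixup idea it builds on, with the mixing coefficient `lam`, the `alpha` parameter, and the random pairing being standard mixup conventions, not ELTA specifics.

```python
import numpy as np

def feature_mixup(feats, labels, alpha=0.2, rng=None):
    """Generic feature-space mixup sketch (not ELTA's dedicated variant):
    each sample's high-level features and labels are convexly blended
    with those of a randomly paired sample."""
    rng = rng if rng is not None else np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)        # mixing coefficient in (0, 1)
    perm = rng.permutation(len(feats))  # random pairing of samples
    mixed_feats = lam * feats + (1 - lam) * feats[perm]
    mixed_labels = lam * labels + (1 - lam) * labels[perm]
    return mixed_feats, mixed_labels

# Example: 4 samples with 3-dim features and scalar aesthetic scores
feats = np.arange(12, dtype=float).reshape(4, 3)
labels = np.array([1.0, 2.0, 3.0, 4.0])
mixed_feats, mixed_labels = feature_mixup(feats, labels)
```

Because the blend is convex, mixed labels always stay within the range of the original aesthetic scores, which is consistent with the abstract's goal of preserving intrinsic aesthetic qualities while enriching minority representations.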
Cite
Text
Liu et al. "ELTA: An Enhancer Against Long-Tail for Aesthetics-Oriented Models." International Conference on Machine Learning, 2024.
Markdown
[Liu et al. "ELTA: An Enhancer Against Long-Tail for Aesthetics-Oriented Models." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/liu2024icml-elta/)
BibTeX
@inproceedings{liu2024icml-elta,
title = {{ELTA: An Enhancer Against Long-Tail for Aesthetics-Oriented Models}},
author = {Liu, Limin and He, Shuai and Ming, Anlong and Xie, Rui and Ma, Huadong},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {31106--31118},
volume = {235},
url = {https://mlanthology.org/icml/2024/liu2024icml-elta/}
}