Fully Test-Time Adaptation for Feature Decrement in Tabular Data
Abstract
Tabular data is widely used in machine learning tasks. Current tabular data learning mainly targets closed environments, whereas real-world applications often involve open environments in which distribution shifts and feature decrements occur, leading to severe performance degradation. Previous studies have focused primarily on distribution shifts, while feature decrements, a challenge unique to tabular data learning, have received relatively little attention. In this paper, we present the first comprehensive study of Fully Test-Time Adaptation for Feature Decrement in Tabular Data. Through empirical analysis, we identify the suboptimality of existing missing-feature imputation methods and the limited applicability of missing-feature adaptation approaches. To address these challenges, we propose a novel method, LLM-IMPUTE, which leverages Large Language Models (LLMs) to impute missing features without relying on training data. Furthermore, we introduce Augmented-Training LLM (ATLLM), a method that improves robustness to feature decrements by simulating feature-decrement scenarios during the training phase, addressing tasks that cannot be imputed by LLM-IMPUTE. Extensive experimental results demonstrate that our proposals significantly improve both performance and robustness in missing-feature imputation and adaptation scenarios.
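To make the ATLLM idea of simulating feature-decrement scenarios during training concrete, here is a minimal sketch of one common way to do it: randomly masking feature columns in each training batch so the model also sees inputs with features removed. The function name, masking strategy, and fill value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def simulate_feature_decrement(X, drop_prob=0.3, fill_value=0.0, rng=None):
    """Randomly mask entries of a tabular batch to mimic features
    that disappear at test time (hypothetical sketch, not the
    paper's ATLLM code).

    X          : (n_samples, n_features) float array
    drop_prob  : probability that each feature value is dropped
    fill_value : placeholder written into dropped positions
    Returns the augmented batch and the boolean drop mask.
    """
    rng = np.random.default_rng(rng)
    mask = rng.random(X.shape) < drop_prob   # True = feature dropped
    X_aug = np.where(mask, fill_value, X)    # replace dropped values
    return X_aug, mask

# Usage: augment a small batch before a training step
X = np.arange(12, dtype=float).reshape(3, 4)
X_aug, mask = simulate_feature_decrement(X, drop_prob=0.5, rng=0)
```

During training, the model would be fed `X_aug` (optionally alongside the mask) so that it learns predictions that degrade gracefully when features are absent.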
Cite
Text

Cheng et al. "Fully Test-Time Adaptation for Feature Decrement in Tabular Data." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/550

BibTeX
@inproceedings{cheng2025ijcai-fully,
title = {{Fully Test-Time Adaptation for Feature Decrement in Tabular Data}},
author = {Cheng, Zi-Jian and Jia, Zi-Yi and Yu, Kun-Yang and Zhou, Zhi and Guo, Lan-Zhe},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2025},
pages = {4940--4948},
doi = {10.24963/IJCAI.2025/550},
url = {https://mlanthology.org/ijcai/2025/cheng2025ijcai-fully/}
}