Meta-Learning via PAC-Bayesian with Data-Dependent Prior: Generalization Bounds from Local Entropy
Cite
Text
Liu et al. "Meta-Learning via PAC-Bayesian with Data-Dependent Prior: Generalization Bounds from Local Entropy." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/506
Markdown
[Liu et al. "Meta-Learning via PAC-Bayesian with Data-Dependent Prior: Generalization Bounds from Local Entropy." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/liu2024ijcai-meta/) doi:10.24963/ijcai.2024/506
BibTeX
@inproceedings{liu2024ijcai-meta,
title = {{Meta-Learning via PAC-Bayesian with Data-Dependent Prior: Generalization Bounds from Local Entropy}},
author = {Liu, Shiyu and Shi, Wei and Xu, Zenglin and Lv, Shaogao and Zhang, Yehong and Wang, Hui},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
  pages = {4578--4586},
doi = {10.24963/ijcai.2024/506},
url = {https://mlanthology.org/ijcai/2024/liu2024ijcai-meta/}
}