Learning Set Functions with Implicit Differentiation

Abstract

Ou et al. [1] introduce the problem of learning set functions from data generated by a so-called optimal subset oracle. Their approach approximates the underlying utility function with an energy-based model, whose training yields a sequence of fixed-point updates during mean-field variational inference. However, as the number of iterations grows, automatic differentiation becomes computationally prohibitive, because backpropagation must store and multiply a Jacobian for every unrolled iteration. We address this challenge with implicit differentiation and examine the conditions under which the fixed-point iterations converge. We empirically demonstrate the efficiency of our method on subset selection applications, including product recommendation and anomaly detection tasks.
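
To make the mechanism concrete, below is a minimal JAX sketch of implicit differentiation through a fixed-point solver, in the spirit of the abstract but not the authors' actual implementation. At a fixed point psi* = f(psi*; theta), the implicit function theorem gives d psi*/d theta = (I - df/dpsi*)^{-1} df/dtheta, so the backward pass can solve a second fixed-point equation for the adjoint vector instead of stacking one Jacobian per forward iteration. The names fixed_point, fwd_solver, and mean_field_step are ours; mean_field_step is a hypothetical contraction standing in for the paper's mean-field update.

from functools import partial

import jax
import jax.numpy as jnp
from jax import lax


def fwd_solver(f, z_init, tol=1e-5):
    # Iterate z <- f(z) until successive iterates are close.
    def cond_fun(carry):
        z_prev, z = carry
        return jnp.linalg.norm(z - z_prev) > tol

    def body_fun(carry):
        _, z = carry
        return z, f(z)

    _, z_star = lax.while_loop(cond_fun, body_fun, (z_init, f(z_init)))
    return z_star


@partial(jax.custom_vjp, nondiff_argnums=(0,))
def fixed_point(f, params, z_init):
    # Differentiable fixed point of z = f(params, z).
    return fwd_solver(lambda z: f(params, z), z_init)


def fixed_point_fwd(f, params, z_init):
    z_star = fixed_point(f, params, z_init)
    return z_star, (params, z_star)


def fixed_point_bwd(f, res, z_bar):
    params, z_star = res
    _, vjp_params = jax.vjp(lambda p: f(p, z_star), params)
    _, vjp_z = jax.vjp(lambda z: f(params, z), z_star)
    # Implicit function theorem: the adjoint w solves w = z_bar + w @ (df/dz),
    # itself a fixed-point equation, so memory is constant in the iteration count.
    w = fwd_solver(lambda w: z_bar + vjp_z(w)[0], z_bar)
    return vjp_params(w)[0], jnp.zeros_like(z_star)  # no gradient through z_init


fixed_point.defvjp(fixed_point_fwd, fixed_point_bwd)


# Hypothetical stand-in for the mean-field update (not the paper's model):
# z <- sigmoid(W z + b) with small ||W||, so the map is a contraction.
def mean_field_step(params, z):
    W, b = params
    return jax.nn.sigmoid(W @ z + b)


W = 0.05 * jax.random.normal(jax.random.PRNGKey(0), (8, 8))
b = jnp.zeros(8)
z0 = jnp.full(8, 0.5)

loss = lambda p: jnp.sum(fixed_point(mean_field_step, p, z0))
grads = jax.grad(loss)((W, b))

Note that lax.while_loop is not reverse-mode differentiable on its own, so plain autodiff could not unroll this solver even if memory allowed; the custom VJP both enables the gradient and keeps its cost independent of the number of forward iterations. Convergence of both fixed-point loops is guaranteed when the update map is a contraction, which is why the sketch keeps the spectral norm of W small.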

Cite

Text

Özcan et al. "Learning Set Functions with Implicit Differentiation." ICML 2024 Workshops: Differentiable Almost Everything, 2024.

Markdown

[Özcan et al. "Learning Set Functions with Implicit Differentiation." ICML 2024 Workshops: Differentiable Almost Everything, 2024.](https://mlanthology.org/icmlw/2024/ozcan2024icmlw-learning/)

BibTeX

@inproceedings{ozcan2024icmlw-learning,
  title     = {{Learning Set Functions with Implicit Differentiation}},
  author    = {Özcan, Gözde and Shi, Chengzhi and Ioannidis, Stratis},
  booktitle = {ICML 2024 Workshops: Differentiable Almost Everything},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/ozcan2024icmlw-learning/}
}