Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior

Abstract

Convolutional deep sets is a neural network architecture that can model stationary stochastic processes. This architecture uses a kernel smoother and a deep convolutional neural network to construct translation-equivariant functional representations. However, the non-parametric nature of the kernel smoother can produce ambiguous representations when too few data points are given. To address this issue, we introduce Bayesian convolutional deep sets, which constructs random translation-equivariant functional representations with a stationary prior. Furthermore, we show how to impose a task-dependent prior for each dataset, since a wrongly imposed prior can result in an even worse representation than that of the kernel smoother. Empirically, we demonstrate that the proposed architecture alleviates the targeted issue in various experiments with time-series and image datasets.
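The kernel-smoother functional representation mentioned in the abstract can be sketched in a few lines. The following is an illustrative NumPy sketch (not the authors' code): it maps an observed set onto a grid with an RBF kernel, producing the density and data channels used in convolutional deep sets, and checks translation equivariance. The kernel choice and length scale are assumptions for illustration; the paper's architecture additionally applies a CNN to this representation.

```python
import numpy as np

def set_conv(xs, ys, grid, length_scale=0.5):
    """Kernel-smoother ("SetConv") representation: embed the set
    {(x_i, y_i)} onto grid points via an RBF kernel, yielding a
    density channel and a smoothed-data channel."""
    # Pairwise RBF weights between grid points and observed inputs.
    w = np.exp(-0.5 * ((grid[:, None] - xs[None, :]) / length_scale) ** 2)
    density = w.sum(axis=1)                      # channel 0: data density
    signal = w @ ys                              # channel 1: smoothed values
    return np.stack([density, signal], axis=1)   # shape (len(grid), 2)

# Translation equivariance: shifting the inputs and the grid by the same
# amount leaves the representation unchanged.
xs = np.array([0.1, 0.4, 0.9])
ys = np.array([1.0, -0.5, 2.0])
grid = np.linspace(0.0, 1.0, 5)
r1 = set_conv(xs, ys, grid)
r2 = set_conv(xs + 0.3, ys, grid + 0.3)
assert np.allclose(r1, r2)
```

With very few observations, the density channel is near zero over most of the grid, which is the ambiguity the paper's Bayesian representation with a stationary prior is designed to mitigate.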

Cite

Text

Jung and Park. "Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior." Artificial Intelligence and Statistics, 2023.

Markdown

[Jung and Park. "Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/jung2023aistats-bayesian/)

BibTeX

@inproceedings{jung2023aistats-bayesian,
  title     = {{Bayesian Convolutional Deep Sets with Task-Dependent Stationary Prior}},
  author    = {Jung, Yohan and Park, Jinkyoo},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2023},
  pages     = {3795--3824},
  volume    = {206},
  url       = {https://mlanthology.org/aistats/2023/jung2023aistats-bayesian/}
}