Learning High-Dimensional Mixed Models via Amortized Variational Inference
Abstract
Modelling longitudinal data is an important yet challenging task. Such datasets can be high-dimensional, exhibit non-linear effects, and contain time-varying covariates. In this work, we leverage linear mixed models (LMMs) and amortized variational inference to provide conditional priors for VAEs, and propose LMM-VAE, a model that is scalable, interpretable, and theoretically connected to GP-based VAEs. We empirically demonstrate that LMM-VAE performs competitively with existing approaches.
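To make the core idea concrete, below is a minimal sketch of how an LMM can supply a conditional prior for a VAE latent space, assuming a PyTorch implementation. All names (`LMMVAE`, the fixed- and random-effect design matrices `X` and `Z`, layer sizes) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class LMMVAE(nn.Module):
    """Sketch: VAE whose latent prior mean follows a linear mixed model."""

    def __init__(self, data_dim, latent_dim, n_fixed, n_random):
        super().__init__()
        # Amortized encoder q(z | y): outputs mean and log-variance.
        self.encoder = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(),
                                     nn.Linear(64, 2 * latent_dim))
        # Decoder p(y | z).
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                     nn.Linear(64, data_dim))
        # LMM parameters for the conditional prior p(z | X, Z):
        # fixed effects beta and (here, point-estimated) random effects b.
        self.beta = nn.Parameter(torch.zeros(n_fixed, latent_dim))
        self.b = nn.Parameter(torch.zeros(n_random, latent_dim))

    def forward(self, y, X, Z):
        # Encode and reparameterize.
        mu_q, logvar_q = self.encoder(y).chunk(2, dim=-1)
        z = mu_q + torch.randn_like(mu_q) * (0.5 * logvar_q).exp()
        recon = self.decoder(z)
        # Conditional prior mean from the linear mixed model: X @ beta + Z @ b.
        mu_p = X @ self.beta + Z @ self.b
        # Closed-form KL between q(z | y) and the LMM prior N(mu_p, I).
        kl = 0.5 * (logvar_q.exp() + (mu_q - mu_p) ** 2 - 1 - logvar_q).sum(-1)
        nll = ((y - recon) ** 2).sum(-1)  # Gaussian likelihood, up to constants
        return (nll + kl).mean()  # negative ELBO
```

Structuring the prior this way keeps inference amortized (a single encoder network) while the fixed- and random-effect terms remain linear and hence interpretable, which is the trade-off the abstract highlights.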
Cite
Text
Ong et al. "Learning High-Dimensional Mixed Models via Amortized Variational Inference." ICML 2024 Workshops: SPIGM, 2024.

Markdown

[Ong et al. "Learning High-Dimensional Mixed Models via Amortized Variational Inference." ICML 2024 Workshops: SPIGM, 2024.](https://mlanthology.org/icmlw/2024/ong2024icmlw-learning/)

BibTeX
@inproceedings{ong2024icmlw-learning,
  title     = {{Learning High-Dimensional Mixed Models via Amortized Variational Inference}},
  author    = {Ong, Priscilla and Haussmann, Manuel and Lähdesmäki, Harri},
  booktitle = {ICML 2024 Workshops: SPIGM},
  year      = {2024},
  url       = {https://mlanthology.org/icmlw/2024/ong2024icmlw-learning/}
}