Subspace-Configurable Networks

Abstract

While the deployment of deep learning models on edge devices is increasing, these models often lack robustness to dynamic changes in the sensed data. Such changes can stem from sensor drift or from variations between deployment data and the data used during offline training, caused by factors such as the specific sensor placement or naturally changing sensing conditions. Achieving the desired robustness therefore requires either an invariant architecture or specialized training approaches, such as data augmentation. Alternatively, input transformations can be treated as a domain shift problem and addressed by post-deployment model adaptation. In this paper, we train a parameterized subspace of configurable networks in which an optimal network for a particular parameter setting is part of the subspace. The obtained subspace is low-dimensional and has a surprisingly simple structure even for complex, non-invertible transformations of the input, making subspace-configurable networks (SCNs) exceptionally efficient when storage and computing resources are limited.
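The abstract does not spell out the construction, but one common way to realize such a parameterized weight subspace is to span it with D base weight sets and let a small configuration network map the transformation parameter (e.g., the cosine and sine of a rotation angle) to D mixing coefficients. The PyTorch sketch below illustrates this idea under those assumptions; the names (`SCNLinear`, `config_net`), the dimension `D`, and the linear-mixing design are illustrative choices, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class SCNLinear(nn.Module):
    """Linear layer whose weights live in a D-dimensional subspace:
    D base weight sets are mixed by coefficients beta that depend on
    the input-transformation parameter (assumed design)."""
    def __init__(self, in_features, out_features, D):
        super().__init__()
        # D base weight sets spanning the weight subspace
        self.weights = nn.Parameter(0.02 * torch.randn(D, out_features, in_features))
        self.biases = nn.Parameter(torch.zeros(D, out_features))

    def forward(self, x, beta):
        # beta: (D,) mixing coefficients for the current configuration
        W = torch.einsum('d,doi->oi', beta, self.weights)  # (out, in)
        b = torch.einsum('d,do->o', beta, self.biases)     # (out,)
        return x @ W.t() + b

class SCN(nn.Module):
    """Hypothetical subspace-configurable network: a configuration
    network maps the transformation parameter alpha to beta, and all
    layers share the same mixing coefficients."""
    def __init__(self, in_features, hidden, num_classes, D, alpha_dim=2):
        super().__init__()
        self.config_net = nn.Sequential(
            nn.Linear(alpha_dim, 32), nn.ReLU(), nn.Linear(32, D)
        )
        self.fc1 = SCNLinear(in_features, hidden, D)
        self.fc2 = SCNLinear(hidden, num_classes, D)

    def forward(self, x, alpha):
        # alpha: (alpha_dim,) encoding of the input transformation,
        # e.g. torch.tensor([cos(theta), sin(theta)]) for rotation
        beta = self.config_net(alpha)  # (D,)
        h = torch.relu(self.fc1(x, beta))
        return self.fc2(h, beta)
```

In this sketch, training would sample a transformation parameter per batch, transform the inputs accordingly, and optimize the base weight sets and the configuration network jointly; at deployment, only the mixing coefficients change with the sensing conditions, which is what makes the approach attractive under tight storage and compute budgets.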

Cite

Text

Wang et al. "Subspace-Configurable Networks." Proceedings of The 3rd Conference on Lifelong Learning Agents, 2024.

Markdown

[Wang et al. "Subspace-Configurable Networks." Proceedings of The 3rd Conference on Lifelong Learning Agents, 2024.](https://mlanthology.org/collas/2024/wang2024collas-subspaceconfigurable/)

BibTeX

@inproceedings{wang2024collas-subspaceconfigurable,
  title     = {{Subspace-Configurable Networks}},
  author    = {Wang, Dong and Saukh, Olga and He, Xiaoxi and Thiele, Lothar},
  booktitle = {Proceedings of The 3rd Conference on Lifelong Learning Agents},
  year      = {2024},
  pages     = {221--251},
  volume    = {274},
  url       = {https://mlanthology.org/collas/2024/wang2024collas-subspaceconfigurable/}
}