Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture
Abstract
Vector Symbolic Architecture (VSA) is emerging in machine learning due to its efficiency, but it is hindered by issues of hyperdimensionality and accuracy. As a promising mitigation, the Low-Dimensional Computing (LDC) method reduces the vector dimension by $\sim$100 times while maintaining accuracy, by employing gradient-based optimization. Despite its potential, LDC optimization for VSA remains underexplored. Our investigation into vector updates underscores the importance of stable, adaptive dynamics in LDC training. We also reveal the overlooked yet critical roles of batch normalization (BN) and knowledge distillation (KD) in standard approaches: besides boosting accuracy, BN adds no computational overhead during inference, and KD significantly enhances inference confidence. Through extensive experiments and ablation studies across multiple benchmarks, we provide a thorough evaluation of our approach and extend its interpretation to binary neural network (BNN) optimization, which is similar to LDC yet previously unaddressed in the BNN literature.
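To make the BN claim concrete, below is a minimal NumPy sketch (not the paper's implementation; all shapes, statistics, and variable names are illustrative assumptions) of the standard argument that batch-norm parameters learned during training can be folded into per-channel thresholds when the output is binarized, so inference over low-dimensional binary vectors incurs no extra normalization cost.

```python
import numpy as np

# Hypothetical sizes: D-dimensional LDC vectors, C output channels (assumed values).
rng = np.random.default_rng(0)
D, C = 64, 10
W = np.sign(rng.standard_normal((C, D)))   # binary (+1/-1) class/weight vectors
x = np.sign(rng.standard_normal(D))        # binary query vector

# Batch-norm statistics and affine parameters learned during training (assumed values).
gamma = rng.uniform(0.5, 1.5, C)
beta  = rng.standard_normal(C)
mu    = rng.standard_normal(C)
var   = rng.uniform(0.5, 1.5, C)
eps   = 1e-5

z = W @ x                                  # raw similarity scores

# Training-time view: binarize the batch-normalized scores, sign(BN(z)).
bn = gamma * (z - mu) / np.sqrt(var + eps) + beta
train_sign = np.sign(bn)

# Inference-time view: BN folded into a per-channel threshold tau,
# so no normalization arithmetic is performed at inference.
tau = mu - beta * np.sqrt(var + eps) / gamma
fold_sign = np.sign(z - tau) * np.sign(gamma)

assert np.array_equal(train_sign, fold_sign)
print("BN folded into per-channel thresholds:", tau.round(3))
```

The equivalence holds because sign(gamma * (z - mu) / s + beta) = sign(gamma) * sign(z - (mu - beta * s / gamma)) for s = sqrt(var + eps), which is why BN improves training dynamics without adding inference cost.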
Cite
Text
Duan et al. "Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture." Conference on Parsimony and Learning, 2025.

Markdown

[Duan et al. "Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture." Conference on Parsimony and Learning, 2025.](https://mlanthology.org/cpal/2025/duan2025cpal-vector/)

BibTeX
@inproceedings{duan2025cpal-vector,
title = {{Towards Vector Optimization on Low-Dimensional Vector Symbolic Architecture}},
author = {Duan, Shijin and Liu, Yejia and Liu, Gaowen and Kompella, Ramana Rao and Ren, Shaolei and Xu, Xiaolin},
booktitle = {Conference on Parsimony and Learning},
year = {2025},
pages = {1413--1432},
volume = {280},
url = {https://mlanthology.org/cpal/2025/duan2025cpal-vector/}
}