Ansatz-Agnostic Exponential Resource Saving in Variational Quantum Algorithms Using Shallow Shadows
Abstract
Cite
Text
Basheer et al. "Ansatz-Agnostic Exponential Resource Saving in Variational Quantum Algorithms Using Shallow Shadows." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/410

Markdown
[Basheer et al. "Ansatz-Agnostic Exponential Resource Saving in Variational Quantum Algorithms Using Shallow Shadows." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/basheer2024ijcai-ansatz/) doi:10.24963/ijcai.2024/410

BibTeX
@inproceedings{basheer2024ijcai-ansatz,
title = {{Ansatz-Agnostic Exponential Resource Saving in Variational Quantum Algorithms Using Shallow Shadows}},
author = {Basheer, Afrad and Feng, Yuan and Ferrie, Christopher and Li, Sanjiang},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {3706--3714},
doi = {10.24963/ijcai.2024/410},
url = {https://mlanthology.org/ijcai/2024/basheer2024ijcai-ansatz/}
}