Tree Cross Attention
Abstract
Cross Attention is a popular method for retrieving information from a set of context tokens for making predictions. At inference time, for each prediction, Cross Attention scans the full set of $\mathcal{O}(N)$ tokens. In practice, however, often only a small subset of tokens is required for good performance. Methods such as Perceiver IO are cheap at inference as they distill the information into a smaller set of $L < N$ latent tokens on which cross attention is then applied, resulting in only $\mathcal{O}(L)$ complexity. However, in practice, as the number of input tokens and the amount of information to distill increase, the number of latent tokens needed also increases significantly. In this work, we propose Tree Cross Attention (TCA) - a module based on Cross Attention that only retrieves information from a logarithmic $\mathcal{O}(\log(N))$ number of tokens for performing inference. TCA organizes the data in a tree structure and performs a tree search at inference time to retrieve the relevant tokens for prediction. Leveraging TCA, we introduce ReTreever, a flexible architecture for token-efficient inference. We show empirically that Tree Cross Attention (TCA) performs comparably to Cross Attention across various classification and uncertainty regression tasks while being significantly more token-efficient. Furthermore, we compare ReTreever against Perceiver IO, showing significant gains while using the same number of tokens for inference.
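The core idea - organizing the $N$ context tokens in a tree and retrieving only $\mathcal{O}(\log(N))$ of them via a root-to-leaf search - can be illustrated with a minimal sketch. This is not the paper's method (TCA learns its search policy; here descent is a simple greedy dot-product heuristic, and internal nodes are assumed to be mean-aggregates of their children), but it shows why the retrieved set is logarithmic: the search keeps one node per level (the unvisited sibling) plus the final leaf, so the full token set stays covered.

```python
import numpy as np

def build_tree(tokens):
    # Leaves hold individual tokens; each internal node summarizes its
    # children (mean aggregation is an illustrative assumption).
    nodes = [{"vec": t, "children": None} for t in tokens]
    while len(nodes) > 1:
        nxt = []
        for i in range(0, len(nodes) - 1, 2):
            pair = nodes[i:i + 2]
            nxt.append({"vec": np.mean([c["vec"] for c in pair], axis=0),
                        "children": pair})
        if len(nodes) % 2:        # carry an unpaired node up a level
            nxt.append(nodes[-1])
        nodes = nxt
    return nodes[0]

def retrieve(root, query):
    # Descend greedily by dot-product score, collecting the sibling of each
    # chosen child so the selected set still covers all tokens: one node per
    # tree level plus the final leaf, i.e. O(log N) nodes in total.
    selected, node = [], root
    while node["children"] is not None:
        best = max(node["children"], key=lambda c: float(query @ c["vec"]))
        selected.extend(c for c in node["children"] if c is not best)
        node = best
    selected.append(node)
    return np.stack([n["vec"] for n in selected])

def cross_attend(query, keys):
    # Standard softmax cross attention over the retrieved O(log N) tokens.
    w = np.exp(keys @ query)
    w /= w.sum()
    return w @ keys
```

With $N = 8$ tokens the search visits three levels, so `retrieve` returns 4 vectors (3 siblings + 1 leaf) instead of all 8 - the token-efficiency the abstract describes.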
Cite
Text
Feng et al. "Tree Cross Attention." International Conference on Learning Representations, 2024.
Markdown
[Feng et al. "Tree Cross Attention." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/feng2024iclr-tree/)
BibTeX
@inproceedings{feng2024iclr-tree,
title = {{Tree Cross Attention}},
author = {Feng, Leo and Tung, Frederick and Hajimirsadeghi, Hossein and Bengio, Yoshua and Ahmed, Mohamed Osama},
booktitle = {International Conference on Learning Representations},
year = {2024},
url = {https://mlanthology.org/iclr/2024/feng2024iclr-tree/}
}