Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers (Student Abstract)

Cite

Text

Dordevic et al. "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/aaai.v38i21.30436

Markdown

[Dordevic et al. "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/dordevic2024aaai-rethinking/) doi:10.1609/aaai.v38i21.30436

BibTeX

@inproceedings{dordevic2024aaai-rethinking,
  title     = {{Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers (Student Abstract)}},
  author    = {Dordevic, Danilo and Bozic, Vukasin and Thommes, Joseph and Coppola, Daniele and Singh, Sidak Pal},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {23477--23479},
  doi       = {10.1609/aaai.v38i21.30436},
  url       = {https://mlanthology.org/aaai/2024/dordevic2024aaai-rethinking/}
}