Parmar, Niki

9 publications

TMLR 2022. Decoder Denoising Pretraining for Semantic Segmentation. Emmanuel Asiedu Brempong, Simon Kornblith, Ting Chen, Niki Parmar, Matthias Minderer, Mohammad Norouzi.
CVPRW 2022. Denoising Pretraining for Semantic Segmentation. Emmanuel Asiedu Brempong, Simon Kornblith, Ting Chen, Niki Parmar, Matthias Minderer, Mohammad Norouzi.
CVPR 2021. Bottleneck Transformers for Visual Recognition. Aravind Srinivas, Tsung-Yi Lin, Niki Parmar, Jonathon Shlens, Pieter Abbeel, Ashish Vaswani.
CVPR 2021. Scaling Local Self-Attention for Parameter Efficient Visual Backbones. Ashish Vaswani, Prajit Ramachandran, Aravind Srinivas, Niki Parmar, Blake Hechtman, Jonathon Shlens.
NeurIPS 2019. Stand-Alone Self-Attention in Vision Models. Prajit Ramachandran, Niki Parmar, Ashish Vaswani, Irwan Bello, Anselm Levskaya, Jonathon Shlens.
ICML 2018. Fast Decoding in Sequence Models Using Discrete Latent Variables. Ɓukasz Kaiser, Samy Bengio, Aurko Roy, Ashish Vaswani, Niki Parmar, Jakob Uszkoreit, Noam Shazeer.
ICML 2018. Image Transformer. Niki Parmar, Ashish Vaswani, Jakob Uszkoreit, Ɓukasz Kaiser, Noam Shazeer, Alexander Ku, Dustin Tran.
NeurIPS 2018. Mesh-TensorFlow: Deep Learning for Supercomputers. Noam Shazeer, Youlong Cheng, Niki Parmar, Dustin Tran, Ashish Vaswani, Penporn Koanantakool, Peter Hawkins, HyoukJoong Lee, Mingsheng Hong, Cliff Young, Ryan Sepassi, Blake Hechtman.
NeurIPS 2017. Attention Is All You Need. Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Ɓukasz Kaiser, Illia Polosukhin.