Amir Habibian
Optimizing Self-Attention
Skip-Attention: Improving Vision Transformers by Paying Less Attention
This work aims to improve the efficiency of vision transformers (ViTs). While ViTs use computationally expensive self-attention …
Shashanka Venkataramanan, Amir Ghodrati, Yuki M Asano, Fatih Porikli, Amirhossein Habibian
PDF
Cite