Tag: Self-Attention

AI Machine Learning & Data Science Research

Microsoft’s FocalNets Replace ViTs’ Self-Attention With Focal Modulation to Improve Visual Modelling

A Microsoft Research team proposes FocalNet (Focal Modulation Network), a simple, attention-free architecture designed to replace the self-attention module in vision transformers. In the team's experiments, FocalNets significantly outperform their self-attention counterparts, enabling more effective and efficient visual modelling in real-world applications.
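For readers curious how focal modulation differs from self-attention, below is a minimal PyTorch sketch of the idea the paper describes: each query token is modulated element-wise by a gated aggregation of multi-scale context built with depthwise convolutions, so no query-key attention matrix is ever formed. The module and parameter names (`FocalModulation`, `focal_levels`, `focal_window`) are illustrative assumptions, not Microsoft's official implementation.

```python
import torch
import torch.nn as nn

class FocalModulation(nn.Module):
    """Minimal sketch of focal modulation; names and defaults are assumptions."""

    def __init__(self, dim: int, focal_levels: int = 3, focal_window: int = 3):
        super().__init__()
        self.focal_levels = focal_levels
        # One linear layer produces the query, the context, and per-level gates.
        self.f = nn.Linear(dim, 2 * dim + focal_levels + 1)
        # Hierarchical contextualization: depthwise convs with growing kernels.
        self.layers = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(dim, dim, kernel_size=focal_window + 2 * l,
                          padding=(focal_window + 2 * l) // 2,
                          groups=dim, bias=False),
                nn.GELU(),
            )
            for l in range(focal_levels)
        )
        self.h = nn.Conv2d(dim, dim, kernel_size=1)   # modulator projection
        self.proj = nn.Linear(dim, dim)               # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, H, W, C) channel-last feature map of visual tokens.
        B, H, W, C = x.shape
        q, ctx, gates = torch.split(self.f(x), [C, C, self.focal_levels + 1], dim=-1)
        ctx = ctx.permute(0, 3, 1, 2)                 # (B, C, H, W)
        gates = gates.permute(0, 3, 1, 2)             # (B, L + 1, H, W)

        # Aggregate multi-scale context, gated per focal level.
        ctx_all = 0
        for l, layer in enumerate(self.layers):
            ctx = layer(ctx)
            ctx_all = ctx_all + ctx * gates[:, l:l + 1]
        # Global context from average pooling serves as the final level.
        ctx_global = ctx.mean(dim=(2, 3), keepdim=True)
        ctx_all = ctx_all + ctx_global * gates[:, self.focal_levels:]

        # Modulation: element-wise product of query and aggregated context.
        y = q * self.h(ctx_all).permute(0, 2, 3, 1)
        return self.proj(y)
```

With `dim=96` and a `(1, 56, 56, 96)` input, the module returns a tensor of the same shape, so it slots in token-wise like a self-attention block while avoiding any query-key interaction.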

AI Machine Learning & Data Science Popular Research

Integrating Self-Attention and Convolution: Tsinghua, Huawei & BAAI’s ACmix Achieves SOTA Performance on CV Tasks With Minimum Cost

In the new paper On the Integration of Self-Attention and Convolution, a research team from Tsinghua University, Huawei Technologies Ltd. and the Beijing Academy of Artificial Intelligence proposes ACmix, a hybrid model that leverages the benefits of both self-attention and convolution for computer vision representation tasks while incurring minimal computational overhead compared to its pure convolution or self-attention counterparts.
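The paper's core observation is that both convolution and self-attention spend most of their compute on 1x1 projections, which can therefore be computed once and shared by both operators. The PyTorch sketch below illustrates that shared-projection structure under simplifying assumptions: the convolution branch here is a plain k x k conv over the stacked projections rather than the paper's shift-and-aggregate design, and all names (`ACmixSketch`, `alpha`, `beta`) are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ACmixSketch(nn.Module):
    """Simplified sketch of the ACmix shared-projection idea; not the paper's exact implementation."""

    def __init__(self, dim: int, heads: int = 4, kernel_size: int = 3):
        super().__init__()
        self.heads = heads
        # Stage I (shared, expensive): 1x1 convs play the role of q/k/v projections.
        self.qkv = nn.Conv2d(dim, 3 * dim, kernel_size=1)
        # Stage II (conv path, lightweight): aggregate the projected maps spatially.
        # The paper shifts and sums k*k feature maps; a k x k conv stands in here.
        self.conv_agg = nn.Conv2d(3 * dim, dim, kernel_size,
                                  padding=kernel_size // 2)
        # Learned scalars weighting the two paths, as in the paper.
        self.alpha = nn.Parameter(torch.ones(1))  # attention-path weight
        self.beta = nn.Parameter(torch.ones(1))   # convolution-path weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, C, H, W = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=1)     # shared projections

        # Stage II (attention path): multi-head self-attention over pixels.
        def split(t):  # (B, C, H, W) -> (B, heads, H*W, C // heads)
            return t.flatten(2).reshape(B, self.heads, C // self.heads,
                                        H * W).transpose(-1, -2)
        attn = F.scaled_dot_product_attention(split(q), split(k), split(v))
        attn = attn.transpose(-1, -2).reshape(B, C, H, W)

        # Stage II (conv path): reuse the same projections, no new heavy compute.
        conv = self.conv_agg(torch.cat([q, k, v], dim=1))

        return self.alpha * attn + self.beta * conv
```

A quick shape check: `ACmixSketch(64)(torch.randn(2, 64, 16, 16))` returns a `(2, 64, 16, 16)` tensor, with both branches fed by the same three 1x1 projections, which is where the claimed savings over running the two operators independently come from.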