The phenomenal success of Google’s BERT and other natural language processing (NLP) models based on transformers isn’t accidental. Behind all the SOTA performances lies transformers’ innovative self-attention mechanism, which enables networks to capture contextual information from an entire text sequence. However, the memory and computational requirements of self-attention grow quadratically with sequence length, making it very expensive to use transformer-based models for processing long sequences.
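To see the scaling problem concretely: full self-attention scores every token against every other token, so just storing the attention matrix requires seq_len² values per head. Below is a back-of-the-envelope sketch (our illustration, not code from the paper; the 4-byte figure assumes fp32 values):

```python
def full_attention_memory(seq_len, bytes_per_value=4):
    # Full self-attention materializes a (seq_len x seq_len) score matrix
    # per attention head, so memory grows quadratically with sequence length.
    return seq_len * seq_len * bytes_per_value

for n in (512, 1024, 4096):
    print(f"seq_len={n}: {full_attention_memory(n) / 1e6:.1f} MB per head")
```

Doubling the sequence length quadruples the memory, which is why standard BERT-style models cap inputs at around 512 tokens.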
To alleviate the quadratic dependency of transformers, a team of researchers from Google Research recently proposed a new sparse attention mechanism dubbed BigBird. In their paper Big Bird: Transformers for Longer Sequences, the team demonstrates that despite being a sparse attention mechanism, BigBird preserves the theoretical properties of full quadratic-attention models: it remains a universal approximator of sequence functions and is Turing complete. In experiments, BigBird is shown to dramatically improve performance across long-context NLP tasks, producing SOTA results in question answering and summarization.

The researchers designed BigBird to satisfy all known theoretical properties of full transformers, building three main components into the model:
- A set of g global tokens that attend to all parts of the sequence.
- For each query q_i, a set of r random keys that the query attends to.
- A window of w local neighbours, so that each node attends to its local structure.
Together, these innovations enable BigBird to handle sequences up to eight times longer than was previously possible on standard hardware (a toy sketch of the combined sparse pattern appears below).
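To make the three components concrete, here is a minimal toy sketch that builds a boolean attention mask combining global, windowed, and random connectivity. It is our illustration under stated assumptions, not the paper's implementation (the real model computes attention in blocks for efficiency), and the parameter names num_global, window, and num_random are hypothetical:

```python
import numpy as np

def bigbird_mask(seq_len, num_global=2, window=3, num_random=2, seed=0):
    """Toy BigBird-style sparse mask: mask[i, j] == True means
    query i may attend to key j. Illustrative only, not the paper's code."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((seq_len, seq_len), dtype=bool)

    # 1. Global tokens attend everywhere and are attended to by all tokens.
    mask[:num_global, :] = True
    mask[:, :num_global] = True

    # 2. Sliding window: each token attends to its local neighbourhood.
    for i in range(seq_len):
        lo, hi = max(0, i - window // 2), min(seq_len, i + window // 2 + 1)
        mask[i, lo:hi] = True

    # 3. Random keys: each query additionally attends to r random positions.
    for i in range(seq_len):
        mask[i, rng.choice(seq_len, size=num_random, replace=False)] = True

    return mask

m = bigbird_mask(16)
print(f"attended fraction: {m.mean():.2f}")  # well below 1.0 (full attention)
```

Because each row of the mask keeps only on the order of g + w + r entries rather than seq_len, memory and compute scale linearly with sequence length.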

Additionally, inspired by BigBird's capability to handle long contexts, the team introduced a novel application of attention-based models: extracting contextual representations of genomics sequences such as DNA. In experiments, BigBird proved beneficial in processing the longer input sequences and delivered improved performance on downstream tasks such as promoter-region and chromatin-profile prediction.



The paper Big Bird: Transformers for Longer Sequences is on arXiv.
Reporter: Fangyu Cai | Editor: Michael Sarazen
