
Amazon Alexa AI’s ‘Language Model Is All You Need’ Explores NLU as QA

Amazon Alexa AI paper asks whether NLU problems could be mapped to question-answering (QA) problems using transfer learning.

New research from Amazon Alexa AI posits that current natural language understanding (NLU) approaches are far from how humans understand language, and asks whether all NLU problems could be efficiently and effectively mapped to question-answering (QA) problems using transfer learning.

Transfer learning is an ML approach for applying knowledge learned from a source domain to a target domain. It has produced promising results in natural language processing (NLP), particularly when transferring learning from high data domains to low data domains. The Amazon researchers focus on a specific type of transfer learning, where the target domain is first mapped to the source domain.


The paper frames NLU as determining the intent and slot (entity) values in natural language utterances. The proposed “QANLU” approach builds slot and intent detection questions and answers based on NLU annotated data. QA models are first trained on QA corpora, then fine-tuned on the questions and answers created from the NLU annotated data. Through transfer learning, this contextual question-answering knowledge is then used for finding intents or slot values in text inputs.
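The mapping from NLU annotations to QA pairs can be pictured with a minimal sketch. The question templates and the `nlu_to_qa` helper below are illustrative assumptions, not the paper’s exact formulation:

```python
# Hypothetical sketch of the QANLU data mapping: an utterance annotated
# with an intent and slot values is rewritten as question-answer pairs
# that a pretrained QA model can be fine-tuned on.
def nlu_to_qa(utterance, intent, slots):
    """Build (question, answer) pairs from one annotated utterance.

    The templates here are invented for illustration; the paper's
    actual question wording may differ.
    """
    qa_pairs = []
    # Intent detection recast as a yes/no question about the utterance.
    qa_pairs.append(
        (f"Is the user asking about {intent.replace('_', ' ')}?", "yes")
    )
    # Each annotated slot becomes an extractive question whose answer
    # is the slot value, i.e. a span of the utterance.
    for slot_name, value in slots.items():
        qa_pairs.append(
            (f"What is the {slot_name.replace('_', ' ')}?", value)
        )
    return qa_pairs

pairs = nlu_to_qa(
    "book a table for two in Seattle",
    intent="book_restaurant",
    slots={"party_size": "two", "city": "Seattle"},
)
```

At inference time the direction reverses: the fine-tuned QA model is asked the same templated questions against a new utterance, and its extracted answer spans are read back as slot values.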

Unlike previous approaches, QANLU focuses on low resource applications and does not require the design and training of new model architectures or extensive data preprocessing. This enables it to achieve strong results in slot and intent detection with an order of magnitude less data.


The researchers conducted experiments on the ATIS and Restaurants-8k datasets. In low data regimes and few-shot settings, QANLU significantly outperformed sentence classification and token tagging approaches on intent and slot detection tasks, and also surpassed the performance of the recent IC/SF few-shot approach in NLU.

The researchers say future directions could include expanding beyond this configuration and across different NLP problems, measuring the transfer of knowledge across different NLP tasks, and studying how QANLU questions might be generated automatically based on context.

The paper Language Model Is All You Need: Natural Language Understanding as Question Answering is on arXiv.

Analyst: Yuqing Li | Editor: Michael Sarazen


Synced Report | A Survey of China’s Artificial Intelligence Solutions in Response to the COVID-19 Pandemic — 87 Case Studies from 700+ AI Vendors

This report offers a look at how China has leveraged artificial intelligence technologies in the battle against COVID-19. It is also available on Amazon Kindle. Along with this report, we also introduced a database covering an additional 1428 artificial intelligence solutions from 12 pandemic scenarios.

Click here to find more reports from us.


We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.
