Amazon Alexa AI’s ‘Language Model Is All You Need’ Explores NLU as QA
An Amazon Alexa AI paper asks whether NLU problems can be mapped to question-answering (QA) problems via transfer learning.
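The core idea — recasting an NLU classification example as a QA input so a pretrained QA model can be transferred to it — can be sketched as follows. This is a minimal illustration of the general recipe, not the paper's exact prompt format; the question wording and function name are assumptions.

```python
def nlu_to_qa(utterance, candidate_intents):
    """Cast an intent-classification example as a QA-style input string:
    a fixed question plus the candidate intents as answer choices, with
    the user utterance serving as the context passage."""
    question = "What does the user want to do?"
    choices = " or ".join(candidate_intents)
    return f"question: {question} {choices} context: {utterance}"

# A classifier over intents becomes a reader that picks its answer
# from the question's answer choices.
example = nlu_to_qa("play some jazz on the kitchen speaker",
                    ["PlayMusic", "SetTimer", "GetWeather"])
```

The payoff of such a mapping is that a model pretrained on large QA corpora can be fine-tuned on the reformatted NLU data, rather than training a task-specific classification head from scratch.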
Google recently introduced mT5, a multilingual variant of its “Text-to-Text Transfer Transformer” (T5), pretrained on a new Common Crawl-based dataset covering 101 languages.
Facebook AI researchers and engineers just made live video content more accessible by enabling automatic closed captions for Facebook Live and Workplace Live.
XTREME is a multi-task benchmark that evaluates the cross-lingual generalization capabilities of multilingual representations across 40 languages and nine tasks.
A group of researchers from KU Leuven and the Technical University of Berlin recently introduced RobBERT, a Dutch RoBERTa-based language model.
Google has now released a major V2 ALBERT update and open-sourced Chinese ALBERT models.
Researchers from Beijing University of Posts and Telecommunications have introduced a novel visual dialogue state tracking (VDST) model that performs strongly on the visual dialogue guessing game “GuessWhat?!”
Due to nuanced character choices and other unique literary and aesthetic characteristics, automatic generation of Chinese poetry is challenging for AI, and high-quality poems are difficult to produce with end-to-end methods.
The recent rapid development of pretrained language models has produced significant performance improvements on downstream NLP tasks.
Carnegie Mellon University researchers have made another leap in the field with their Joint Language-to-Pose (JL2P) model, which generates animations from text input via a joint multimodal space comprising language and poses.
A team of researchers from Carnegie Mellon University and Google Brain has proposed XLNet, a new language model that outperforms BERT on 20 language tasks, including SQuAD, GLUE, and RACE, achieving SOTA results on 18 of them.
Baidu has released ERNIE (Enhanced Representation through kNowledge IntEgration), a new knowledge integration language representation model which outperforms Google’s state-of-the-art BERT (Bidirectional Encoder Representations from Transformers) on Chinese language tasks.
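ERNIE's knowledge integration centers on masking whole entities and phrases during pretraining, rather than individual subword tokens as in BERT, so the model must predict the full unit from context. A minimal sketch of that masking step, assuming token lists and pre-identified entity spans (the function name and span format are illustrative, not Baidu's implementation):

```python
import random

def entity_level_mask(tokens, entity_spans, mask_token="[MASK]", p=0.15):
    """Sketch of ERNIE-style knowledge masking: for each entity span that
    is selected for masking, replace every token in the span with the mask
    token, forcing the model to recover the whole entity from context.

    entity_spans: list of (start, end) index pairs, end exclusive.
    p: probability that a given entity span is masked.
    """
    masked = list(tokens)
    for start, end in entity_spans:
        if random.random() < p:        # decide per span, not per token
            for i in range(start, end):
                masked[i] = mask_token
    return masked
```

For example, with the entity “Harry Potter” identified as a span, either both tokens are masked or neither is — a subword-level masker could leave “Harry [MASK]”, which is much easier to guess from local co-occurrence alone.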
This paper implements an encoder-decoder-reconstructor framework for neural machine translation, evaluated on the English-Japanese translation task.
Computer science and machine learning scholars are increasingly interested in languages, and natural language processing has shown great progress as a result.
This talk was recently given by Prof. Christopher Manning at the Simons Institute, UC Berkeley. It is an introductory tutorial without complicated algorithms.