A group of Google researchers led by Quoc Le — the AI expert behind Google Neural Machine Translation and AutoML — has published a paper proposing attention augmentation. In experiments, the novel two-dimensional relative self-attention mechanism for image classification delivers “consistent improvements in image classification.”
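The core idea of attention augmentation is to let a layer attend over all spatial positions of a feature map and concatenate the attention output with the convolutional features along the channel dimension. Below is a minimal NumPy sketch of that fusion step; all array sizes and projection matrices (`Wq`, `Wk`, `Wv`) are hypothetical, and the paper's two-dimensional relative position embeddings are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4x4 feature map, 8 channels, key/value dims 8 and 4.
H, W, C, dk, dv = 4, 4, 8, 8, 4
x = rng.normal(size=(H * W, C))      # feature map, flattened over spatial positions

# Learned projections (randomly initialized here for illustration)
Wq = rng.normal(size=(C, dk))
Wk = rng.normal(size=(C, dk))
Wv = rng.normal(size=(C, dv))

q, k, v = x @ Wq, x @ Wk, x @ Wv
logits = q @ k.T / np.sqrt(dk)       # (HW, HW): every position attends to every other
weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
attn_out = weights @ v               # (HW, dv) attention features

# Attention augmentation: concatenate conv features with attention features
augmented = np.concatenate([x, attn_out], axis=-1)
print(augmented.shape)               # (16, 12) = (H*W, C + dv)
```

In the paper this augmented tensor replaces a plain convolution output, so downstream layers see both local (convolutional) and global (attentional) features.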
Tsinghua Natural Language Processing Group (THUNLP) has published a great reading list for any budding AI researchers whose New Year’s resolution is to study machine translation. The list compiles the most influential machine translation papers from the past 30 years, spotlighting the 10 most important contributions to the development of machine translation.
Deep Learning has become an essential toolbox used across a wide variety of applications, research labs, and industries. In this tutorial, given at NIPS 2017, the speakers provide a set of guidelines to help newcomers to the field understand the most recent and advanced models and their application to diverse data modalities.
Microsoft researchers in the US and Asia sent a shockwave through the AI community today with their paper Achieving Human Parity on Automatic Chinese to English News Translation, which introduces a neural machine translation system they say equals the performance of human experts in Chinese-to-English translation.
To combine the advantages of these two methods, the authors of this paper first adapt a multi-source NMT model: separate encoders capture the semantics of each source input, and the decoder then generates the final output from the multiple context vector representations produced by those encoders.
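The multi-source setup described above can be sketched in a few lines: each source runs through its own encoder to produce a context vector, and the decoder fuses those vectors before predicting output. This is a toy NumPy illustration under stated assumptions — the `encode` mean-pooling, the concatenate-and-project fusion, and all dimensions are simplifications, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8  # hidden size (hypothetical)

def encode(token_embeddings):
    # Toy "encoder": mean-pool the token embeddings into one context vector.
    # A real multi-source NMT model would use a separate RNN/Transformer per source.
    return token_embeddings.mean(axis=0)

def decode_step(contexts, W_fuse):
    # Fuse the per-encoder context vectors by concatenation, then project
    # back to the hidden size — one simple way to combine multiple sources.
    fused = np.concatenate(contexts)   # (num_sources * D,)
    return np.tanh(W_fuse @ fused)     # (D,) decoder state

# Two source inputs of different lengths, each seen by its own encoder
src_a = rng.normal(size=(5, D))
src_b = rng.normal(size=(7, D))
contexts = [encode(src_a), encode(src_b)]

W_fuse = rng.normal(size=(D, 2 * D)) / np.sqrt(2 * D)
state = decode_step(contexts, W_fuse)
print(state.shape)  # (8,)
```

The design choice worth noting is that fusion happens in the decoder: the encoders stay independent, so sources of different lengths (or even different languages) can each be modeled on their own terms.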