ACL 2020 Announces Best Paper & Test-Of-Time Awards
Organizers of the 58th Annual Meeting of the Association for Computational Linguistics (ACL) today announced their Best Paper Awards.
Researchers from the University of Washington, Salesforce Research, and the Allen Institute for Artificial Intelligence have introduced a graph-based method that retrieves reasoning paths to boost multi-hop open-domain question answering.
Large transformer-based language models trained on pixel sequences can generate coherent images without the use of labels.
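The technique behind this result (OpenAI's Image GPT) treats an image as a flat sequence of discrete pixel values and trains a causal transformer on next-pixel prediction, exactly as a language model predicts the next word. Below is a minimal sketch of that objective with a small stand-in model, not the actual iGPT architecture:

```python
import torch
import torch.nn as nn

# Toy sketch of pixel-sequence modeling: quantize each pixel to a token,
# flatten the image to a 1-D sequence, and train a causal transformer on
# next-pixel prediction. PixelTransformer is a stand-in, not iGPT itself.
VOCAB = 256          # 8-bit intensities treated as discrete tokens
SEQ_LEN = 32 * 32    # a 32x32 image flattened to one sequence

class PixelTransformer(nn.Module):
    def __init__(self, d_model=256, n_layers=4, n_heads=8):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        self.pos = nn.Parameter(torch.zeros(SEQ_LEN, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, VOCAB)

    def forward(self, tokens):                        # tokens: (B, T)
        t = tokens.size(1)
        x = self.embed(tokens) + self.pos[:t]
        # Causal mask so each position attends only to earlier pixels.
        mask = torch.triu(torch.full((t, t), float("-inf")), diagonal=1)
        return self.head(self.blocks(x, mask=mask))   # (B, T, VOCAB)

model = PixelTransformer()
images = torch.randint(0, VOCAB, (8, SEQ_LEN))        # fake image batch
logits = model(images[:, :-1])                        # predict pixel t from <t
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), images[:, 1:].reshape(-1))
```

No labels are needed because the supervision signal is the image itself: every pixel is the prediction target for the prefix that precedes it.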
In a recent blog post, Google AI researchers report on their efforts and progress in machine translation, especially for low-resource languages.
DeepMind researchers have developed EATS, a text-to-speech generative model trained adversarially in an end-to-end manner that achieves performance comparable to SOTA models.
Researchers add syntactic biases to determine whether and where they can help BERT achieve better understanding.
Canadian education technology startup Korbit Technologies has introduced a personalized AI-powered learning experience that it says can help all students learn faster and better in a cost-effective way.
A Google Research team proposes BLEURT, an automatic evaluation metric built on the company's highly successful BERT language model.
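At its core, BLEURT fine-tunes BERT to regress human quality ratings for reference-candidate sentence pairs (the released model also adds a synthetic-data pretraining stage not shown here). A rough sketch of that regression setup using Hugging Face transformers; the checkpoint choice and scoring head are illustrative assumptions:

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizer

# BLEURT-style learned metric sketch: encode a (reference, candidate) pair
# with BERT and regress a scalar quality score from the [CLS] vector.
# Omits BLEURT's synthetic-data pretraining; checkpoint is an assumption.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
score_head = nn.Linear(encoder.config.hidden_size, 1)

def metric_score(reference: str, candidate: str):
    inputs = tokenizer(reference, candidate, return_tensors="pt",
                       truncation=True)
    cls = encoder(**inputs).last_hidden_state[:, 0]   # [CLS] embedding
    return score_head(cls).squeeze(-1)                # predicted rating

# Training would minimize MSE against human ratings, e.g.
# nn.functional.mse_loss(metric_score(ref, cand), human_rating)
```

Because the score is learned rather than computed from n-gram overlap, the metric can reward paraphrases that BLEU would penalize.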
Organizers of the 58th annual meeting of the Association for Computational Linguistics (ACL) on Sunday announced the list of accepted papers for the world-leading natural language processing (NLP) conference.
To deliver human-level voices to its platform’s billions of users while maintaining strict compute efficiency, Facebook AI researchers have deployed a new neural TTS system that works on CPU servers.
A team from the Allen Institute for Artificial Intelligence and the University of Washington this week introduced TLDR generation, a new automatic summarization task for scientific papers.
Facebook AI's Blender is the first chatbot to blend a diverse set of conversational skills, including empathy, knowledge, and personality, in one system.
How much does it cost to train a state-of-the-art AI model, and what are the main factors affecting that price tag?
Researchers “posit that the universes of knowledge and experience available to NLP models can be defined by successively larger world scopes: from a single corpus to a fully embodied and social context.”
XTREME is a multi-task benchmark that evaluates the cross-lingual generalization capabilities of multilingual representations across 40 languages and nine tasks.
Covid-Sanity is a web interface designed to navigate the flood of bioRxiv and medRxiv COVID-19 papers and make the research within more searchable and sortable.
Researchers from Bocconi University have prepared an online overview of the commonalities and differences between language-specific BERT models and mBERT.
Researchers propose a novel model compression approach that effectively compresses BERT through progressive module replacing.
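The approach, known as BERT-of-Theseus, trains a compact "successor" module for each group of original "predecessor" layers by stochastically swapping the successor in during fine-tuning and annealing the swap probability toward 1. A toy illustration of that mechanism; the modules and schedule below are stand-ins, not the paper's configuration:

```python
import random
import torch
import torch.nn as nn

# Toy progressive module replacing: each forward pass routes through either
# the original ("predecessor") layers or the compact ("successor") module,
# chosen by a coin flip whose success probability is annealed toward 1 so
# the compressed model gradually takes over. All modules are stand-ins.
class TheseusBlock(nn.Module):
    def __init__(self, predecessor: nn.Module, successor: nn.Module):
        super().__init__()
        self.predecessor = predecessor   # frozen original layers
        self.successor = successor       # trainable compact replacement
        self.replace_prob = 0.0

    def forward(self, x):
        if random.random() < self.replace_prob:
            return self.successor(x)
        return self.predecessor(x)

# Compress a 4-layer stack into 2 blocks (two original layers -> one).
dim = 768
net = nn.Sequential(*[
    TheseusBlock(nn.Sequential(nn.Linear(dim, dim), nn.ReLU(),
                               nn.Linear(dim, dim), nn.ReLU()),
                 nn.Sequential(nn.Linear(dim, dim), nn.ReLU()))
    for _ in range(2)
])
x = torch.randn(4, dim)
for step in range(1000):
    for block in net:
        block.replace_prob = min(1.0, step / 500)   # linear ramp-up
    out = net(x)   # task loss and backward pass would go here
```

Once the replacement probability reaches 1, only the successor modules remain on the forward path, yielding the compressed model with no separate distillation loss.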
In an attempt to equip the TF-IDF-based retriever with a state-of-the-art neural reading comprehension model, researchers introduced a new graph-based recurrent retrieval approach.
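This is the same path-retrieval work highlighted in the multi-hop question answering item above: TF-IDF seeds the candidate set, and a recurrent neural scorer then expands reasoning paths along Wikipedia's hyperlink graph. A stripped-down beam-search skeleton of the idea; the three helpers are hypothetical stubs, not the authors' code:

```python
# Skeleton of the graph-based recurrent retrieval idea: grow reasoning
# paths hop by hop over a hyperlink graph, keeping the best partial paths.
def tfidf_candidates(question):
    return ["para_A", "para_B", "para_C"]          # TF-IDF seed paragraphs

def linked_paragraphs(paragraph):                  # hyperlink-graph lookup
    links = {"para_A": ["para_D"], "para_B": ["para_E", "para_F"]}
    return links.get(paragraph, [])

def score_path(question, path):                    # neural scorer stand-in
    return -len(path)

def retrieve_reasoning_paths(question, beam=8, max_hops=3):
    paths = [[p] for p in tfidf_candidates(question)]
    for _ in range(max_hops - 1):
        expanded = list(paths)
        for path in paths:
            # Follow hyperlinks out of the last paragraph in the path.
            expanded += [path + [nxt] for nxt in linked_paragraphs(path[-1])]
        paths = sorted(expanded, key=lambda p: score_path(question, p),
                       reverse=True)[:beam]
    # A reading comprehension model would then rank the surviving paths
    # and extract the final answer span.
    return paths

print(retrieve_reasoning_paths("who wrote the sequel to X?"))
```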
Researchers have proposed a novel self-adversarial learning (SAL) paradigm for improving GANs’ performance in text generation.
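In the SAL setup, a comparative discriminator judges whether the generator's current sample improves on one of its own earlier samples, replacing the usual absolute real/fake signal. A schematic of the reward computation only; the comparator and the log-odds form of the reward are assumptions, not necessarily the paper's exact formulation:

```python
import torch

# Schematic of the self-adversarial reward: a comparative discriminator
# estimates the probability that the current generated sample is better
# than one of the generator's own earlier samples, so the generator is
# rewarded for self-improvement rather than for fooling a real/fake
# classifier. The log-odds reward below is one plausible choice.
def self_adversarial_reward(comparator, current_sample, past_sample):
    p_better = comparator(current_sample, past_sample)   # in (0, 1)
    return torch.log(p_better) - torch.log1p(-p_better)

# Dummy comparator that mildly prefers the current sample:
reward = self_adversarial_reward(
    lambda cur, past: torch.tensor(0.6), "sample_t", "sample_t_minus_1")
```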
A recent Google Brain paper looks into Google’s hugely successful transformer network — BERT — and how it represents linguistic information internally.
Deep learning models are getting larger and larger to meet the demand for better and better performance. Meanwhile, the time and cost required to train them keep climbing.
Now, DeepMind and University College London (UCL) have introduced MEMO, a new deep network that matches SOTA results on Facebook's bAbI dataset for testing text understanding and reasoning, and is the first and only architecture capable of solving novel reasoning tasks over long sequences.
One of a new breed of open-domain chatbots designed to engage in conversations across any topic, Meena’s free and natural conversational abilities are closing the gap on human performance.
Facebook AI researchers have further developed the BART model with the introduction of mBART.
A team of researchers from the Natural Language Processing Lab at the University of British Columbia in Canada has proposed AraNet, a deep learning toolkit designed for Arabic social media processing.
No matter whether we’re sharing our lives with beloved household pets or protecting wildlife in a remote location, wouldn’t it be wonderful if we could somehow lift the language barrier that has impeded interspecies communication for millennia?
A recent paper published by Microsoft researchers proposes a new vision-language pretrained model for image-text joint embedding, ImageBERT, which achieves SOTA performance on both the MSCOCO and Flickr30k datasets.
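Joint embedding here means image regions and text tokens share one transformer: detected region features are projected into the token embedding space, concatenated with the text sequence, and encoded together. A minimal sketch of that input fusion; all dimensions and modules are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Toy image-text joint embedding: project detected region features into the
# token embedding space, concatenate with text embeddings, and encode both
# modalities with one shared transformer. Dimensions are assumptions.
d_model, vocab = 768, 30522
text_embed = nn.Embedding(vocab, d_model)
region_proj = nn.Linear(2048, d_model)   # e.g. RoI features from a detector
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True), 4)

tokens = torch.randint(0, vocab, (2, 16))     # fake caption batch
regions = torch.randn(2, 36, 2048)            # fake region-feature batch
fused = torch.cat([region_proj(regions), text_embed(tokens)], dim=1)
joint = encoder(fused)                        # (2, 36 + 16, d_model)
# Pretraining objectives (masked LM, masked region modeling, image-text
# matching) would be applied on top of `joint`.
```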
A group of researchers from the Katholieke Universiteit Leuven and the Technical University of Berlin recently introduced RobBERT, a Dutch RoBERTa-based language model.
Researchers recently proposed a new machine learning method for worldbuilding based on content from LIGHT, a research environment open-sourced by Facebook comprising crowd-sourced game locations, characters, objects, and more.
Google has now released a major ALBERT v2 update and open-sourced Chinese ALBERT models.