Category: Research

Technical review of the newest machine intelligence research.

AI Machine Learning & Data Science Research

Google & Northwestern U Present Provably Efficient Learning Algorithms for Neural Networks

A research team from Google Research and Northwestern University presents polynomial-time and sample-efficient algorithms for learning an unknown depth-2 feedforward neural network with general ReLU activations, aiming to provide insights into whether efficient algorithms exist for learning ReLU networks.
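For reference, a depth-2 feedforward ReLU network with k hidden units computes a function of the form below (our notation and reading, not the paper's); the learning problem is to recover such a function efficiently from labeled samples:

\[
f(x) \;=\; \sum_{i=1}^{k} a_i \,\mathrm{ReLU}\!\left(w_i^{\top} x + b_i\right),
\qquad \mathrm{ReLU}(z) = \max(z, 0),
\]

where the weights \(w_i\), biases \(b_i\), and output coefficients \(a_i\) are unknown; "general" ReLU activations allow the biases \(b_i\) to be nonzero.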

AI Machine Learning & Data Science Natural Language Tech Research

Melbourne U, Facebook & Twitter Expose Novel Numerical Errors in NMT Systems

A research team from the University of Melbourne, Facebook AI, and Twitter Cortex proposes a systematic black-box test method for assessing and debugging how neural machine translation systems translate numbers. The approach reveals novel types of errors that appear across multiple state-of-the-art translation systems.
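The paper's exact test suite is not reproduced here, but the general recipe, templated source sentences with controlled numerical values, translated by an arbitrary black-box system and checked for number preservation, can be sketched as follows (translate_fn and the templates are hypothetical placeholders):

```python
import re

def numerical_translation_test(translate_fn, templates, numbers):
    """Black-box check: does the translated sentence still contain the source number?

    translate_fn: any callable str -> str wrapping an NMT system (hypothetical).
    templates:    source sentences with a {num} slot, e.g. "The bridge is {num} metres long."
    numbers:      test values such as "72", "1,000", "3.14159".
    """
    failures = []
    for template in templates:
        for num in numbers:
            source = template.format(num=num)
            target = translate_fn(source)
            # Compare digit sequences only, so formatting differences are tolerated.
            if re.sub(r"[^\d]", "", num) not in re.sub(r"[^\d]", "", target):
                failures.append((source, target))
    return failures
```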

AI Machine Learning & Data Science Research

Baidu’s Knowledge-Enhanced ERNIE 3.0 Pretraining Framework Delivers SOTA NLP Results, Surpasses Human Performance on the SuperGLUE Benchmark

A research team from Baidu proposes ERNIE 3.0, a unified framework for pretraining large-scale, knowledge-enhanced models that can easily be tailored to both natural language understanding and generation tasks via zero-shot learning, few-shot learning, or fine-tuning, and that achieves state-of-the-art results on NLP tasks.

AI Machine Learning & Data Science Research

New Study Proposes Quantum Belief Function, Achieves an Exponential Speedup

A research team from the University of Electronic Science and Technology of China, the Chinese Academy of Sciences, the School of Education at Shaanxi Normal University, the Japan Advanced Institute of Science and Technology, and ETH Zurich encodes the basic belief assignment (BBA) into quantum states and implements these states on a quantum circuit, aiming to exploit the characteristics of quantum computation to handle belief functions more efficiently.
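For background, a basic belief assignment over a frame of discernment \(\Theta\) is the standard Dempster-Shafer set function (this definition is textbook material, not taken from the paper):

\[
m : 2^{\Theta} \to [0,1], \qquad m(\emptyset) = 0, \qquad \sum_{A \subseteq \Theta} m(A) = 1,
\]

so a BBA assigns mass to subsets of \(\Theta\) rather than to single outcomes; it is these \(2^{|\Theta|}\) masses that the authors encode into quantum states.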

AI Machine Learning & Data Science Research

Two Lines of Code Let a 2080 Ti Do What Previously Required a V100

As dynamic computational graphs are now widely supported by machine learning frameworks, GPU memory efficiency when training on a dynamic graph has become a key consideration for these frameworks. In the recently released v1.4, MegEngine reduces GPU memory usage at the cost of additional computation using the Dynamic Tensor Rematerialization (DTR) technique plus further engineering optimizations, making large-batch-size training possible on a single GPU.
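MegEngine's DTR automates this rematerialization, and its exact two-line API is not reproduced here. The underlying trade, however, dropping intermediate activations in the forward pass and recomputing them during the backward pass, is the same one exposed manually by gradient checkpointing; a minimal PyTorch sketch of that trade (an analogous technique, not MegEngine's implementation) looks like this:

```python
import torch
from torch import nn
from torch.utils.checkpoint import checkpoint

block = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096))
x = torch.randn(8, 4096, requires_grad=True)

# Normal forward: all intermediate activations are kept for backward.
y_full = block(x)

# Checkpointed forward: activations inside `block` are dropped and
# recomputed during backward, trading extra compute for less GPU memory.
y_ckpt = checkpoint(block, x, use_reentrant=False)
y_ckpt.sum().backward()
```

DTR goes further by deciding at runtime which tensors to evict and recompute, so the user does not have to choose checkpoint boundaries by hand.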

AI Computer Vision & Graphics Machine Learning & Data Science Popular Research

Facebook & UC Berkeley Substitute a Convolutional Stem to Dramatically Boost Vision Transformers’ Optimization Stability

A research team from Facebook AI and UC Berkeley addresses vision transformers' optimization instability by simply using a standard, lightweight convolutional stem in ViT models. The change dramatically increases optimizer stability and improves peak performance without sacrificing computational efficiency.
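Concretely, the change swaps ViT's single large-stride patchify convolution for a short stack of small convolutions that produces the same token grid. A rough PyTorch sketch (channel counts and image size are illustrative assumptions, not the paper's exact configuration):

```python
import torch
from torch import nn

embed_dim = 384  # illustrative ViT embedding width

# Standard ViT "patchify" stem: one stride-16, 16x16 convolution.
patchify_stem = nn.Conv2d(3, embed_dim, kernel_size=16, stride=16)

# Convolutional stem: stride-2 3x3 convolutions reaching the same 16x
# downsampling, followed by a 1x1 projection to the embedding width.
conv_stem = nn.Sequential(
    nn.Conv2d(3, 48, 3, stride=2, padding=1), nn.BatchNorm2d(48), nn.ReLU(),
    nn.Conv2d(48, 96, 3, stride=2, padding=1), nn.BatchNorm2d(96), nn.ReLU(),
    nn.Conv2d(96, 192, 3, stride=2, padding=1), nn.BatchNorm2d(192), nn.ReLU(),
    nn.Conv2d(192, 384, 3, stride=2, padding=1), nn.BatchNorm2d(384), nn.ReLU(),
    nn.Conv2d(384, embed_dim, 1),
)

x = torch.randn(1, 3, 224, 224)
# Both stems produce a 14x14 grid of embed_dim-dimensional tokens.
assert patchify_stem(x).shape == conv_stem(x).shape == (1, embed_dim, 14, 14)
```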

AI Computer Vision & Graphics Machine Learning & Data Science Research

Video Swin Transformer Improves Speed-Accuracy Trade-offs, Achieves SOTA Results on Video Recognition Benchmarks

A research team from Microsoft Research Asia, University of Science and Technology of China, Huazhong University of Science and Technology, and Tsinghua University takes advantage of the inherent spatiotemporal locality of videos to present a pure-transformer backbone architecture for video recognition that leads to a better speed-accuracy trade-off.
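The spatiotemporal locality is exploited by computing self-attention within local 3D windows that span a few frames and a small spatial neighbourhood, rather than over all tokens of the video. A rough sketch of such 3D window partitioning (window size and tensor layout are our assumptions, not the paper's exact settings):

```python
import torch

def window_partition_3d(x, window=(2, 7, 7)):
    """Split a video feature map (B, T, H, W, C) into non-overlapping 3D windows.

    Self-attention is then computed independently within each window, which keeps
    the cost linear in the number of tokens rather than quadratic.
    """
    B, T, H, W, C = x.shape
    wt, wh, ww = window
    x = x.view(B, T // wt, wt, H // wh, wh, W // ww, ww, C)
    # -> (num_windows * B, tokens_per_window, C)
    return x.permute(0, 1, 3, 5, 2, 4, 6, 7).reshape(-1, wt * wh * ww, C)

feats = torch.randn(1, 8, 56, 56, 96)   # e.g. 8 frames of 56x56 tokens
windows = window_partition_3d(feats)    # -> (256, 98, 96)
```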

AI Machine Learning & Data Science Research

New Milestone for Deep Potential Application: Predicting the Phase Diagram of Water

A research team from Princeton University, the Institute of Applied Physics and Computational Mathematics, and the Beijing Institute of Big Data Research uses the Deep Potential (DP) method to predict the phase diagram of water from ab initio quantum theory, covering conditions from low temperatures and pressures up to about 2400 K and 50 GPa. The paper was published in the leading physics journal Physical Review Letters and represents an important milestone in the application of DP.

AI Machine Learning & Data Science Natural Language Tech Research

Google Researchers Merge Pretrained Teacher LMs Into a Single Multilingual Student LM Via Knowledge Distillation

A Google Research team proposes MergeDistill, a framework that merges multiple pretrained monolingual and multilingual teacher LMs into a single multilingual, task-agnostic student LM, leveraging the capabilities of powerful language-specific LMs while remaining multilingual and enabling positive language transfer.
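MergeDistill's full recipe is not reproduced here, but its distillation core, training the student on soft targets from whichever teacher covers an example's language, can be sketched as a standard temperature-scaled KL loss (the variable names, shared-vocabulary assumption, and temperature are ours):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between teacher and student token distributions.

    student_logits / teacher_logits: (batch, seq_len, vocab) from models assumed
    to share a vocabulary; the teacher is the language-specific pretrained LM
    responsible for this batch's language.
    """
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # Scale by t**2 so gradient magnitudes are comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t ** 2)
```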

AI Machine Learning & Data Science Research

Pieter Abbeel Team’s Decision Transformer Abstracts RL as Sequence Modelling

A research team from UC Berkeley, Facebook AI Research and Google Brain casts Reinforcement Learning (RL) as a sequence modelling problem. Their proposed Decision Transformer simply outputs optimal actions by leveraging a causally masked transformer, yet matches or exceeds state-of-the-art model-free offline RL baselines on Atari, OpenAI Gym, and Key-to-Door tasks.
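Concretely, trajectories are rewritten as interleaved (return-to-go, state, action) tokens, and a causally masked transformer is trained to predict each action from the tokens that precede it. A much-simplified sketch of that idea (ours, not the authors' code; timestep embeddings and other details are omitted):

```python
import torch
from torch import nn

class TinyDecisionTransformer(nn.Module):
    """Minimal causal transformer over (return-to-go, state, action) token triples."""

    def __init__(self, state_dim, act_dim, d_model=128):
        super().__init__()
        self.embed_rtg = nn.Linear(1, d_model)
        self.embed_state = nn.Linear(state_dim, d_model)
        self.embed_action = nn.Linear(act_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.predict_action = nn.Linear(d_model, act_dim)

    def forward(self, rtg, states, actions):
        # rtg: (B, T, 1)  states: (B, T, state_dim)  actions: (B, T, act_dim)
        B, T, _ = states.shape
        tokens = torch.stack(
            [self.embed_rtg(rtg), self.embed_state(states), self.embed_action(actions)],
            dim=2,
        ).reshape(B, 3 * T, -1)  # interleave as R_1, s_1, a_1, R_2, s_2, a_2, ...
        causal_mask = nn.Transformer.generate_square_subsequent_mask(3 * T)
        h = self.transformer(tokens, mask=causal_mask)
        # Predict each action from the hidden state at its preceding state token.
        return self.predict_action(h[:, 1::3])

model = TinyDecisionTransformer(state_dim=17, act_dim=6)
acts = model(torch.randn(2, 10, 1), torch.randn(2, 10, 17), torch.randn(2, 10, 6))  # (2, 10, 6)
```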

AI Machine Learning & Data Science Research

What Matters in Adversarial Imitation Learning? Google Brain Study Reveals Valuable Insights

A research team from Google Brain conducts a comprehensive empirical study of more than fifty design choices in a generic adversarial imitation learning (AIL) framework, exploring their impact on continuous-control tasks in a large-scale study of more than 500k trained agents, to provide practical insights and recommendations for designing novel and effective AIL algorithms.
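For context, a generic AIL framework pairs a discriminator, trained to separate expert transitions from the agent's transitions, with a policy optimized on a reward derived from that discriminator. One common instantiation (a standard GAIL-style objective and reward in our notation, not the specific variant the study ends up recommending) is

\[
\max_{D}\;\; \mathbb{E}_{(s,a)\sim\pi_{E}}\big[\log D(s,a)\big]
\;+\; \mathbb{E}_{(s,a)\sim\pi}\big[\log\big(1 - D(s,a)\big)\big],
\qquad
r(s,a) \;=\; -\log\big(1 - D(s,a)\big),
\]

where \(D\) outputs the probability that a transition came from the expert policy \(\pi_{E}\); choices of this kind, such as the exact reward transform applied to \(D\), are among the design decisions such a framework leaves open.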