Tag: Deep Neural Networks

Machine Learning & Data Science Natural Language Tech Popular

Google Approaches BERT-Level Performance Using 300x Fewer Parameters with Extension of Its New NLP model PRADO

The trimmed-down pQRNN extension to Google AI’s projection-based neural network PRADO performs comparably to BERT on text classification tasks for on-device use.

AI Research

BatchNorm + Dropout = DNN Success!

A group of researchers from Tencent Technology, the Chinese University of Hong Kong, and Nankai University recently combined two commonly used techniques — Batch Normalization (BatchNorm) and Dropout — into an Independent Component (IC) layer inserted before each weight layer to make that layer’s inputs more independent.
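The IC layer described above is simply BatchNorm followed by Dropout, applied to the inputs of a weight layer. Below is a minimal NumPy sketch of that idea; the function name, argument names, and inverted-dropout scaling are illustrative assumptions, not the authors' code.

```python
import numpy as np

def ic_layer(x, drop_p=0.5, eps=1e-5, rng=None):
    """Sketch of an Independent Component (IC) layer:
    batch normalization followed by dropout (hypothetical implementation)."""
    rng = np.random.default_rng(0) if rng is None else rng
    # BatchNorm: zero mean, unit variance per feature over the batch axis
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Dropout: zero out activations with probability drop_p and rescale
    # the survivors (inverted dropout), so the expected activation is unchanged
    mask = rng.random(x.shape) >= drop_p
    return x_hat * mask / (1.0 - drop_p)

# Example: normalize a batch of 64 samples with 4 features each
x = np.random.default_rng(1).normal(2.0, 3.0, size=(64, 4))
out = ic_layer(x, drop_p=0.5)
```

In a network, the output of `ic_layer` would feed directly into the next weight layer, so that layer sees inputs that are normalized and (after dropout) less correlated.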

AI Research

Global Minima Solution for Neural Networks?

New research from Carnegie Mellon University, Peking University, and the Massachusetts Institute of Technology shows that global minima of deep neural networks can be achieved via gradient descent under certain conditions. The paper Gradient Descent Finds Global Minima of Deep Neural Networks was published November 12 on arXiv.