Tag: Deep Neural Networks

AI Research

Peeking Inside DNNs With Information Theory

Deep learning model performance has taken huge strides, allowing researchers to tackle tasks that were simply impossible for machines less than a decade ago.

AI Research

BatchNorm + Dropout = DNN Success!

A group of researchers from Tencent Technology, the Chinese University of Hong Kong, and Nankai University recently combined two commonly used techniques, Batch Normalization (BatchNorm) and Dropout, into a single Independent Component (IC) layer inserted before each weight layer to make that layer's inputs more independent.
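
The snippet below is a minimal PyTorch sketch of this idea, assuming the IC layer is simply BatchNorm followed by Dropout placed in front of a weight layer; the class name, dropout rate, and layer sizes are illustrative assumptions, not the authors' exact implementation.

```python
import torch.nn as nn

class ICLayer(nn.Module):
    """Sketch of an Independent Component (IC) layer: BatchNorm followed
    by Dropout, placed before a weight layer. Name and defaults are
    illustrative assumptions, not the paper's reference code."""
    def __init__(self, num_features, dropout_p=0.1):
        super().__init__()
        self.batchnorm = nn.BatchNorm1d(num_features)
        self.dropout = nn.Dropout(dropout_p)

    def forward(self, x):
        # Normalize activations, then randomly zero units to reduce
        # correlation among the inputs fed to the next weight layer.
        return self.dropout(self.batchnorm(x))

# Hypothetical usage: an IC layer in front of each Linear (weight) layer.
net = nn.Sequential(
    ICLayer(784), nn.Linear(784, 256), nn.ReLU(),
    ICLayer(256), nn.Linear(256, 10),
)
```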

AI Research

Global Minima Solution for Neural Networks?

New research from Carnegie Mellon University, Peking University, and the Massachusetts Institute of Technology shows that gradient descent can reach global minima of deep neural networks under certain conditions, such as sufficient over-parameterization. The paper Gradient Descent Finds Global Minima of Deep Neural Networks was published November 12, 2018 on arXiv.
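
As a rough illustration of the regime the paper studies, the toy script below (an assumption-laden sketch, not the paper's experimental setup) trains a heavily over-parameterized two-layer network on a handful of random samples with plain full-batch gradient descent; in this setting the training loss typically falls to near zero. The data sizes, width, and learning rate are arbitrary choices for demonstration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Few samples, very wide hidden layer: an over-parameterized regime.
n, d, width = 20, 10, 4096
X = torch.randn(n, d)
y = torch.randn(n, 1)

model = nn.Sequential(nn.Linear(d, width), nn.ReLU(), nn.Linear(width, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.05)  # full batch = gradient descent
loss_fn = nn.MSELoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# With a wide enough network, the training loss is typically near zero.
print(f"final training loss: {loss.item():.2e}")
```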