Tag: Deep Neural Networks

AI Machine Learning & Data Science Research

BENDR for BCI: UToronto’s BERT-Inspired DNN Training Approach Learns From Unlabelled EEG Data

University of Toronto researchers propose a BERT-inspired training approach as a self-supervised pretraining step that enables deep neural networks to leverage massive, newly available public EEG (electroencephalography) datasets for downstream brain-computer interface (BCI) applications.
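The sketch below illustrates the general idea of BERT-style masked pretraining on unlabeled EEG windows: random timesteps are masked and a transformer encoder is trained to reconstruct them. It is a minimal, hypothetical example (the model sizes, masking rate, and reconstruction loss are assumptions), not the authors' exact BENDR architecture.

```python
# Minimal sketch of BERT-style masked pretraining on unlabeled EEG
# (hypothetical shapes and hyperparameters; not the exact BENDR model).
import torch
import torch.nn as nn

class MaskedEEGPretrainer(nn.Module):
    def __init__(self, n_channels=20, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)           # per-timestep projection
        self.mask_token = nn.Parameter(torch.zeros(d_model))  # learned [MASK] embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.decode = nn.Linear(d_model, n_channels)          # reconstruct raw channels

    def forward(self, x, mask):
        # x: (batch, time, channels) unlabeled EEG; mask: (batch, time) bool
        h = self.embed(x)
        # Replace masked timesteps with the learned mask embedding.
        h = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(h), h)
        return self.decode(self.encoder(h))

model = MaskedEEGPretrainer()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 128, 20)              # toy batch of unlabeled EEG windows
mask = torch.rand(8, 128) < 0.15         # mask roughly 15% of timesteps
recon = model(x, mask)
loss = nn.functional.mse_loss(recon[mask], x[mask])  # score only the masked steps
loss.backward()
opt.step()
```

After pretraining on unlabeled recordings in this fashion, the encoder can be fine-tuned on much smaller labeled datasets for downstream BCI classification tasks.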

AI Research

BatchNorm + Dropout = DNN Success!

A group of researchers from Tencent Technology, the Chinese University of Hong Kong, and Nankai University recently combined two commonly used techniques — Batch Normalization (BatchNorm) and Dropout — into an Independent Component (IC) layer inserted before each weight layer to make inputs more independent.
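As a rough illustration of the idea, the sketch below composes BatchNorm and Dropout into a single module placed before each weight layer of a small network. The layer widths and dropout rate are illustrative assumptions, not values from the paper.

```python
# Sketch of an IC (Independent Component) layer: BatchNorm followed by Dropout,
# inserted before each weight layer. Sizes and dropout rate are assumptions.
import torch
import torch.nn as nn

class ICLayer(nn.Module):
    """Normalizes and randomly drops activations feeding the next weight layer."""
    def __init__(self, num_features, drop_p=0.05):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features)  # normalize/decorrelate activations
        self.drop = nn.Dropout(drop_p)          # further reduce dependence between units

    def forward(self, x):
        return self.drop(self.bn(x))

# A small MLP in which each hidden weight layer is preceded by an IC layer.
net = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    ICLayer(256), nn.Linear(256, 128), nn.ReLU(),
    ICLayer(128), nn.Linear(128, 10),
)
print(net(torch.randn(32, 784)).shape)  # torch.Size([32, 10])
```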

AI Research

Global Minima Solution for Neural Networks?

New research from Carnegie Mellon University, Peking University, and the Massachusetts Institute of Technology shows that global minima of deep neural networks can be achieved via gradient descent under certain conditions. The paper Gradient Descent Finds Global Minima of Deep Neural Networks was published on arXiv on November 12.