AI Research

Does Deep Learning Still Need Backpropagation?

When training deep neural networks, the goal is to automatically discover good “internal representations.” One of the most widely accepted methods for this is backpropagation, which uses a gradient descent approach to adjust the neural network’s weights. Now, researchers from the Victoria University of Wellington School of Engineering and Computer Science have introduced the HSIC (Hilbert-Schmidt independence criterion) bottleneck as an alternative to backpropagation for finding good representations.

The new method has several distinct advantages. Instead of propagating errors through the chain rule as traditional backpropagation does, the HSIC bottleneck trains the network layer by layer, eliminating the vanishing and exploding gradient issues found in backpropagation. It also allows layers to be trained in parallel, and as a result requires significantly fewer operations. Finally, the proposed method removes backward sweeps, eliminating the requirement for symmetric feedback.
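To make the layer-wise idea more concrete, here is a minimal NumPy sketch of the biased empirical HSIC estimator with a Gaussian kernel, combined into a per-layer objective of the form HSIC(Z, X) − β·HSIC(Z, Y). The paper itself uses a normalized variant (nHSIC), and the function names, β, and σ values below are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def gaussian_gram(X, sigma):
    """Gram matrix of a Gaussian kernel k(x, x') = exp(-||x - x'||^2 / (2 * sigma^2))."""
    sq_dists = np.sum(X**2, axis=1, keepdims=True) \
             + np.sum(X**2, axis=1) - 2.0 * X @ X.T
    return np.exp(-sq_dists / (2.0 * sigma**2))

def hsic(X, Y, sigma=5.0):
    """Biased empirical HSIC estimate: tr(Kx H Ky H) / (m - 1)^2."""
    m = X.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m          # centering matrix
    Kx = gaussian_gram(X, sigma)
    Ky = gaussian_gram(Y, sigma)
    return np.trace(Kx @ H @ Ky @ H) / (m - 1) ** 2

def hsic_bottleneck_objective(hidden, inputs, labels, beta=100.0, sigma=5.0):
    """Per-layer objective in the spirit of the paper: keep the hidden
    activations independent of the raw input while staying dependent on
    the labels. beta and sigma are illustrative, not the paper's settings."""
    return hsic(hidden, inputs, sigma) - beta * hsic(hidden, labels, sigma)

# Toy usage on random data (batch of 64 examples).
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 784))                    # flattened inputs
Z = rng.normal(size=(64, 128))                    # one layer's activations
Y = np.eye(10)[rng.integers(0, 10, size=64)]      # one-hot labels
print(hsic_bottleneck_objective(Z, X, Y))
```

Because each layer's objective depends only on that layer's activations, the inputs, and the labels, no error signal needs to flow backward through the rest of the network.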

The researchers presented two approaches for producing usable classifications from an HSIC-bottleneck-trained network. The first is a standard feedforward network, which generates one-hot results that can be directly permuted to perform classification. The second is the σ-combined network, in which a single layer is appended as an aggregator that assembles all the hidden representations, each trained with a specific σ, so that information at different scales σ is available to the post-training.
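As a rough illustration of the σ-combined approach, the PyTorch sketch below appends a single trainable linear layer on top of frozen, σ-specific hidden representations; the class and variable names, dimensions, and σ values are hypothetical, and the HSIC training of the branches themselves is not shown.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SigmaCombinedHead(nn.Module):
    """Illustrative aggregator in the spirit of the sigma-combined network:
    hidden representations produced by branches trained with different
    kernel widths sigma are concatenated and fed to a single linear layer,
    which is the only part trained with an ordinary classification loss."""
    def __init__(self, branch_dims, num_classes=10):
        super().__init__()
        self.classifier = nn.Linear(sum(branch_dims), num_classes)

    def forward(self, branch_outputs):
        # branch_outputs: one tensor per sigma-specific branch, each of
        # shape (batch, branch_dim); detach() keeps the branches frozen.
        combined = torch.cat([h.detach() for h in branch_outputs], dim=1)
        return self.classifier(combined)

# Toy usage: three hypothetical branches (e.g. sigma = 1, 5, 10) feed the head.
head = SigmaCombinedHead(branch_dims=[64, 64, 64], num_classes=10)
branch_outputs = [torch.randn(32, 64) for _ in range(3)]   # stand-ins for real activations
logits = head(branch_outputs)                              # shape: (32, 10)
loss = F.cross_entropy(logits, torch.randint(0, 10, (32,)))
loss.backward()                                            # gradients reach only the aggregator
```

Only this small aggregator is trained with a conventional loss; the σ-specific representations it consumes come from the HSIC-bottleneck training described above.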

The research team also conducted experiments on the MNIST, FashionMNIST, and CIFAR-10 datasets for classic classification problems. Some of the results are summarized below.

In experiments comparing an HSIC-bottleneck post-trained ResNet with a standard backpropagation-trained ResNet, the HSIC bottleneck provides a significant boost in performance, opening the possibility of learning classification tasks at near-competitive accuracy without the limitations of backpropagation.

While backpropagation still plays a core role in most AI research, leading AI scientists are exploring alternatives. Last year backpropagation pioneer Dr. Geoffrey Hinton suggested the research community should ditch the technique and focus on unsupervised learning instead: “I don’t think [backpropagation] is how the brain works. We clearly don’t need all the labeled data.”

The paper The HSIC Bottleneck: Deep Learning Without Back-Propagation is on arXiv.


Author: Hecate He | Editor: Michael Sarazen
