Ten years ago, Geoffrey Hinton and his University of Toronto students published the paper ImageNet Classification with Deep Convolutional Neural Networks, presenting the first convolutional neural network to significantly surpass state-of-the-art results on the ImageNet database. The paper was recently honoured, in its first year of eligibility, with the NeurIPS 2022 Test of Time Award. In his NeurIPS keynote speech last week, Hinton offered his thoughts on the future of machine learning — focusing on what he has dubbed the “Forward-Forward” (FF) algorithm.
Deep neural networks trained by stochastic gradient descent, with huge parameter counts and massive datasets, have achieved stunning triumphs over the past decade. The gradients of such models are typically computed using backpropagation, a technique Hinton helped pioneer. But there is growing interest in whether the biological brain actually relies on backpropagation or, as Hinton asks, whether it has some other way of getting the gradients it needs to adjust the weights on its connections. In this regard, Hinton proposes the FF algorithm as an alternative to backpropagation for neural network learning.
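For contrast, here is a minimal sketch of the conventional training loop that FF seeks to replace — one forward pass, one backward pass that propagates error gradients through every layer, and a stochastic gradient descent update. The two-layer model, the sizes and the dummy batch below are illustrative assumptions, not taken from any of the cited papers.

```python
import torch
import torch.nn as nn

# Illustrative two-layer network; the sizes are arbitrary.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 784)          # dummy batch of inputs
y = torch.randint(0, 10, (32,))   # dummy labels

opt.zero_grad()
loss = loss_fn(model(x), y)       # forward pass
loss.backward()                   # backward pass: gradients for every layer's weights
opt.step()                        # stochastic gradient descent update
```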

The FF algorithm is inspired by Boltzmann machines (Hinton and Sejnowski, 1986) and Noise Contrastive Estimation (Gutmann and Hyvärinen, 2010). It aims to replace the forward and backward passes of backpropagation with two forward passes: a positive pass that operates on real data and adjusts the weights “to improve the goodness in every hidden layer,” and a negative pass that operates on externally supplied or model-generated “negative data” and adjusts the weights to reduce the goodness. Goodness can be measured, for example, as the sum of the squared neural activities in a layer. The objective for each layer of the network is thus to have high goodness for positive data and low goodness for negative data.
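As a rough illustration of the idea, the sketch below updates a single layer with the two forward passes, taking goodness to be the sum of squared activations and using a logistic objective around a threshold theta, as described in the paper. The function names, the layer-normalization detail and the hyperparameters are illustrative assumptions, not Hinton's reference implementation.

```python
import numpy as np

def goodness(h):
    # Goodness of a layer: sum of squared activations, per example.
    return (h ** 2).sum(axis=1)

def layer_forward(W, b, x):
    # Length-normalize the input (so only its direction carries information
    # forward), then apply a ReLU layer.
    xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    return np.maximum(0.0, xn @ W + b), xn

def ff_layer_step(W, b, x_pos, x_neg, theta=2.0, lr=0.03):
    # One Forward-Forward update for a single layer: raise the goodness of
    # positive data above theta and push the goodness of negative data
    # below it, using only gradients local to this layer.
    def local_grads(x, sign):
        h, xn = layer_forward(W, b, x)
        p = 1.0 / (1.0 + np.exp(-sign * (goodness(h) - theta)))
        # Gradient of the logistic loss w.r.t. this layer's pre-activations
        # only; no error signal crosses layer boundaries.
        dz = 2.0 * h * (-sign * (1.0 - p))[:, None] * (h > 0)
        return xn.T @ dz, dz.sum(axis=0)

    dW_pos, db_pos = local_grads(x_pos, +1.0)   # positive pass on real data
    dW_neg, db_neg = local_grads(x_neg, -1.0)   # negative pass on negative data
    W -= lr * (dW_pos + dW_neg)
    b -= lr * (db_pos + db_neg)
    return W, b

# Illustrative usage: one layer, random "positive" and "negative" batches.
rng = np.random.default_rng(0)
W, b = 0.1 * rng.standard_normal((784, 256)), np.zeros(256)
x_pos, x_neg = rng.random((32, 784)), rng.random((32, 784))
W, b = ff_layer_step(W, b, x_pos, x_neg)
```

Because each layer's update depends only on its own activations, the layers can be trained in sequence with purely local updates, which is what removes the need for a backward pass through the whole network.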
Hinton posits that the FF algorithm is a more plausible model of how learning happens in the cortex of the brain, and that it could run on analogue hardware with far lower energy consumption. He also advocates abandoning the strict separation of hardware and software in computer science, suggesting that future computers be designed and built as “non-permanent” or “mortal” — with learned weights tied to a particular physical device and dying with it — to save computational resources, and that the FF algorithm is the learning method best suited to such hardware.


In an empirical study, the FF algorithm achieved a 1.4 percent test error rate on the MNIST dataset without using complicated regularizers, performing comparably to backpropagation. The FF approach also delivered results competitive with backpropagation on the CIFAR-10 dataset.
Hinton suggests the proposed FF algorithm, combined with a mortal computing model, could one day enable trillion-parameter neural networks to run on only a few watts of power. Although he turned 75 this month, the Turing Award winner's ambitious new research shows he is not resting on his laurels.
The paper The Forward-Forward Algorithm: Some Preliminary Investigations is available on the University of Toronto website.
Author: Hecate He | Editor: Michael Sarazen, Chain Zhang
