
Cornell & NTT’s Physical Neural Networks: a “Radical Alternative for Implementing Deep Neural Networks” That Enables Arbitrary Physical Systems Training

A team from Cornell University and NTT Research proposes Physical Neural Networks (PNNs), a universal framework that leverages a backpropagation algorithm to train arbitrary, real physical systems to execute deep neural networks.

Deep neural networks (DNNs) already provide the best solutions for many complex problems in image recognition, speech recognition, and natural language processing. Now, DNNs are entering the physical arena. DNNs and physical processes share numerous structural similarities, such as hierarchy, approximate symmetries, redundancy, and nonlinearity, suggesting the potential for DNNs to operate effectively on data from the physical world.

In the paper Deep Physical Neural Networks Enabled by a Backpropagation Algorithm for Arbitrary Physical Systems, a research team from Cornell University and NTT Research proposes that the controlled evolution of physical systems is well-suited to the realization of deep learning models, and introduces Physical Neural Networks (PNNs), a novel framework that leverages a backpropagation algorithm to train arbitrary, real physical systems to execute deep neural networks.

The principle behind backpropagation algorithms is to model mathematical operations by adjusting the weights applied to input signals so that they produce the expected output signal. The algorithm computes the gradient of the error with respect to each parameter, and gradient descent then updates the parameters to improve model performance.
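To make this concrete, here is a minimal, self-contained sketch of a gradient-descent update for a one-parameter linear model; the model, learning rate, and target values are illustrative and not taken from the paper.

```python
# Minimal gradient-descent sketch for a one-parameter linear model
# y = w * x with squared-error loss L = (w * x - y_target) ** 2.
# The gradient is dL/dw = 2 * (w * x - y_target) * x.

def sgd_step(w, x, y_target, lr=0.1):
    y_pred = w * x                         # forward pass
    grad = 2.0 * (y_pred - y_target) * x   # gradient of the loss w.r.t. w
    return w - lr * grad                   # step against the gradient

w = 0.0
for _ in range(50):
    w = sgd_step(w, x=2.0, y_target=6.0)
print(round(w, 3))  # approaches 3.0, since 3.0 * 2.0 == 6.0
```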

The proposed PNN framework is enabled by a Physics-Aware Training (PAT) approach based on a novel hybrid physical-digital algorithm that can execute the backpropagation algorithm efficiently and accurately on any sequence of physical input-output transformations. In essence, a problem is solved by applying backpropagation to train sequences of real physical operations to perform the desired functions.

The PAT training process comprises five steps:

  1. Training data is fed into the physical system along with the controllable parameters.
  2. In a forward pass, the physical system applies its transformation to produce an output.
  3. The physical output is compared to the intended output to compute the error.
  4. Using a differentiable digital model to estimate the gradients of the physical system, the gradient of the loss is computed with respect to the controllable parameters.
  5. The parameters are updated according to the inferred gradient.

The process repeats during training, iterating over training examples until the error falls below a pre-defined threshold.
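The sketch below shows what one such training step could look like in PyTorch, under stated assumptions: `physical_system` is a hypothetical stand-in for the real apparatus (non-differentiable, observable only through its outputs), `digital_model` is a hypothetical differentiable simulation of it, and all names, shapes, and hyperparameters are illustrative rather than taken from the paper's code.

```python
import torch

def physical_system(x, theta):
    # Stand-in for the real physical transformation (e.g. optical pulse
    # propagation): a nonlinear map plus measurement noise, with no
    # gradient information available.
    with torch.no_grad():
        return torch.tanh(x @ theta) + 0.01 * torch.randn(x.shape[0], theta.shape[1])

def digital_model(x, theta):
    # Differentiable approximation of the same transformation; used only
    # on the backward pass to estimate gradients.
    return torch.tanh(x @ theta)

theta = torch.randn(12, 4, requires_grad=True)  # controllable physical parameters
optimizer = torch.optim.SGD([theta], lr=0.05)

def pat_step(x, y_target):
    # Steps 1-2: forward pass through the *physical* system.
    y_phys = physical_system(x, theta)
    # Step 3: the error is computed on the physical output.
    loss = torch.mean((y_phys - y_target) ** 2)
    # Step 4: estimate gradients with the *digital* model. The hybrid
    # output takes its value from the physical measurement but routes
    # gradients through the differentiable simulation.
    y_sim = digital_model(x, theta)
    y_hybrid = y_sim + (y_phys - y_sim).detach()
    loss_hybrid = torch.mean((y_hybrid - y_target) ** 2)
    optimizer.zero_grad()
    loss_hybrid.backward()
    # Step 5: update the parameters with the estimated gradient.
    optimizer.step()
    return loss.item()

# One illustrative step on random data with 12-dimensional inputs.
x = torch.randn(32, 12)
y_target = torch.randn(32, 4)
print(pat_step(x, y_target))
```

The key design choice is the `y_hybrid` line: the forward value comes from the physical measurement, while gradients flow through the differentiable simulation, which is what lets standard autograd drive updates to the physical parameters.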

The researchers evaluated PNNs’ generality using three diverse physical systems — optical, mechanical, and electrical.

In one experiment, the team tested a PNN based on broadband optical second harmonic generation (SHG) with shaped femtosecond pulses. The PNN was tasked with classifying spoken vowels from 12-dimensional input vectors of formant frequencies extracted from audio recordings. The results showed that the proposed SHG-PNN classified vowels with 93 percent accuracy.

On the MNIST handwritten digit classification task, adding trainable SHG transformations to digital operations boosted accuracy from roughly 90 percent to 97 percent.

The team believes PNNs provide a basis for hardware-physics-software co-design in ML and have the potential to facilitate the development of novel ML hardware that is orders of magnitude faster and more energy-efficient than conventional electronic processors.

The paper Deep Physical Neural Networks Enabled by a Backpropagation Algorithm for Arbitrary Physical Systems is on arXiv.


Author: Hecate He | Editor: Michael Sarazen, Chain Zhang



