
Qualcomm AI Maps DL to Quantum Computer via Quantum Field Theory

A team from Qualcomm AI proposes the direct mapping of a deep neural network onto an optical quantum computer through the language of quantum field theory, paving the way for the future development of novel quantum neural network architectures.

The development of increasingly complex and powerful architectures has enabled deep learning (DL) to scale to large, heterogeneous and multiclass problems. In step with these stunning successes, however, DL training algorithms have become extremely computationally expensive. With Moore’s law faltering, the AI research community is seeking new ways to meet this growing compute demand.

Fortunately, exciting possibilities are expected to open up with the emergence of quantum computing devices that can overcome the technological and thermodynamic limits of classical computation. Research on exploiting quantum computing devices to carry out DL, however, remains in its nascent phase, and it is not yet clear how to map neural networks onto a quantum computer.

In the paper The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning, a team from Qualcomm AI proposes a direct mapping of a deep neural network onto an optical quantum computer through the language of quantum field theory, an approach that could pave the way for the future development of novel quantum neural network architectures.

The “Hintons” in the paper title is an homage to Geoffrey Hinton — the researchers coined the term to refer to “the elementary excitation of the quantum field from which optical quantum neural networks are made.”


The researchers begin by reviewing the framework of probabilistic numeric neural networks, which classify input signals with missing data by using Gaussian processes (GPs) to interpolate the signal and then using the GP representation to define the neural network.
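To make that classical starting point concrete, here is a minimal NumPy sketch of GP interpolation at a missing location. The RBF kernel, noise level and example values are our own illustrative choices, not taken from the paper:

```python
import numpy as np

def rbf_kernel(xa, xb, lengthscale=1.0):
    """Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))."""
    d = xa[:, None] - xb[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_obs, y_obs, x_new, noise=1e-4):
    """GP posterior mean and covariance at x_new given observations.

    The mean interpolates the signal at the missing locations; the
    covariance quantifies the uncertainty the later layers must carry.
    """
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_star = rbf_kernel(x_new, x_obs)
    mean = K_star @ np.linalg.solve(K, y_obs)
    cov = rbf_kernel(x_new, x_new) - K_star @ np.linalg.solve(K, K_star.T)
    return mean, cov

# Signal observed at x1 = 0.0 and x3 = 1.0, missing at x2 = 0.5.
mean, cov = gp_posterior(np.array([0.0, 1.0]), np.array([0.2, 0.9]),
                         np.array([0.5]))
print(mean, np.diag(cov))
```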

Then they introduce a series of quantum operations that generalize the classical layers of a probabilistic numeric neural network. First, they show how to perform Bayesian inference with Gaussian states in such a way as to allow quantum entanglement to represent an agent’s uncertainty about discretization errors. In the next step, they show how to perform the quantum equivalent of a linear layer that acts on the quantum fields in the same way as a classical linear layer acts on a classical field.
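As a rough classical analogue of that quantum linear layer (a sketch assuming the state is Gaussian and fully described by its mean and covariance; this is not the paper's operator construction), an affine layer only needs to update the first two moments:

```python
import numpy as np

def linear_layer_gaussian(mean, cov, W, b):
    """Push a Gaussian N(mean, cov) through an affine map x -> W x + b.

    Gaussians are closed under affine maps, so the layer reduces to
    moment bookkeeping: the same bookkeeping a Gaussian (bosonic) state
    requires, up to the symplectic constraints the quantum version must
    additionally respect.
    """
    new_mean = W @ mean + b
    new_cov = W @ cov @ W.T
    return new_mean, new_cov

# Toy usage: a 3-unit layer acting on a 2-dimensional Gaussian.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
b = np.zeros(3)
print(linear_layer_gaussian(np.array([0.5, -0.2]), 0.1 * np.eye(2), W, b))
```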

Finally, the researchers explain how to embed classical neural networks in the quantum model, interpreting the resulting model as a semiclassical limit of the quantum model. Specifically, the proposed model takes the uncertainty relation for the covariance from quantum mechanics, while the non-linearity is treated classically: like classical non-linearities, a quantum non-linearity acts pointwise on the quantum fields.
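The following sketch illustrates one way to read that semiclassical split in code: the mean is transformed pointwise like a classical activation, while the covariance is propagated through the local linearization and kept above a small floor standing in for the quantum uncertainty bound. The function, the tanh choice and the floor value are all our own assumptions, not the paper's construction:

```python
import numpy as np

def semiclassical_nonlinearity(mean, cov, floor=1e-3):
    """Pointwise non-linearity in a semiclassical spirit.

    Applies tanh element-wise to the means, pushes the covariance
    through the local Jacobian of the pointwise map, and clips the
    variances at a floor standing in for the quantum uncertainty bound
    (the floor value is illustrative, not taken from the paper).
    """
    jac = np.diag(1.0 - np.tanh(mean) ** 2)   # derivative of tanh at the mean
    new_mean = np.tanh(mean)
    new_cov = jac @ cov @ jac.T
    new_cov[np.diag_indices_from(new_cov)] = np.maximum(np.diag(new_cov), floor)
    return new_mean, new_cov
```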

Hierarchy of the neural networks considered

To test the semiclassical neural network’s performance, the researchers implemented it on an optical quantum computer, where they say it “performed as expected.”

High-level depiction of the implementation of the model in quantum optical hardware. The input on the left includes observations y1, y3 of a signal at locations x1, x3, while the value at the intermediate location x2 is missing. Laser beams for all locations x1, x2, x3 are then prepared, and quantum GP (QGP) inference is used to create a posterior state. Finally, a series of linear and non-linear layers is applied until an observable is measured with a detector to obtain a class C for the input signal.
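Tying the sketches above together, a minimal classical emulation of this pipeline might look like the following. It reuses the hypothetical gp_posterior, linear_layer_gaussian and semiclassical_nonlinearity functions from the earlier sketches, and the architecture is purely illustrative:

```python
import numpy as np

def classify_signal(x_obs, y_obs, x_missing, layers):
    """Classical emulation of the figure's pipeline (illustrative only)."""
    # QGP-inference stand-in: interpolate the missing location classically.
    mean, cov = gp_posterior(x_obs, y_obs, x_missing)
    # Assemble the input "field": observed values plus the interpolated one.
    mean_vec = np.concatenate([y_obs, mean])
    cov_mat = np.diag(np.concatenate([1e-4 * np.ones_like(y_obs),
                                      np.diag(cov)]))
    # Alternate linear and pointwise non-linear layers.
    for W, b in layers:
        mean_vec, cov_mat = linear_layer_gaussian(mean_vec, cov_mat, W, b)
        mean_vec, cov_mat = semiclassical_nonlinearity(mean_vec, cov_mat)
    # "Detector": read the class off the largest output mean.
    return int(np.argmax(mean_vec))

# Toy usage: observations at x1 = 0.0 and x3 = 1.0, x2 = 0.5 missing.
rng = np.random.default_rng(1)
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),
          (rng.normal(size=(2, 4)), np.zeros(2))]   # two output classes
print(classify_signal(np.array([0.0, 1.0]), np.array([0.2, 0.9]),
                      np.array([0.5]), layers))
```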

The researchers summarize their main contributions as:

  1. Show how to use Gaussian states for Bayesian inference.
  2. Devise unitary operators that implement standard non-linearities.
  3. Present tractable limits of the quantum network.
  4. Discuss how to implement the proposed models on a quantum computer.

The team also proposes exciting possible future directions in this field, such as studying approximate solutions that get closer to the full quantum model, finding efficient ways to do quantum GP inference on quantum hardware, and developing further quantum non-linearities and a quantum formalism for classical models.

The paper The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning is on arXiv.


Author: Hecate He | Editor: Michael Sarazen


We know you don’t want to miss any news or research breakthroughs. Subscribe to our popular newsletter Synced Global AI Weekly to get weekly AI updates.
