Originally designed for simulating quantum physics, tensor networks are now increasingly applied to machine learning tasks such as image recognition. However, there are currently no production-level libraries for running tensor network algorithms at a large scale, and most published tensor network research still focuses on physics applications.
In a joint effort with the Perimeter Institute for Theoretical Physics and X (an Alphabet company), Google AI researchers recently announced a new open source library, TensorNetwork, which can greatly improve the efficiency of tensor computations in tensor network algorithms. TensorNetwork uses TensorFlow as its backend and is optimized for GPU processing. The researchers report that a GPU with the TensorNetwork library can achieve computation speeds up to 100x faster than a CPU when training a tree tensor network (TTN) to find the ground state energy of a physical system.
Tensors are generalizations of scalars, vectors, and matrices to higher dimensions, and tensor networks are graphically encoded tensor contraction patterns.
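To make the definition concrete, here is a minimal NumPy sketch of tensors of increasing order; the "order" of a tensor is simply the number of indices needed to address one of its entries:

```python
import numpy as np

# A tensor's order is the number of indices needed to address one entry.
scalar = np.array(5.0)            # order 0: no indices
vector = np.array([1.0, 2.0])     # order 1: one index
matrix = np.eye(2)                # order 2: two indices (row, column)
tensor3 = np.zeros((2, 3, 4))     # order 3: three indices

for name, t in [("scalar", scalar), ("vector", vector),
                ("matrix", matrix), ("order-3 tensor", tensor3)]:
    print(f"{name}: order {t.ndim}, shape {t.shape}")
```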
A tensor network does not directly store or manipulate a large tensor; instead, it represents the tensor as the contraction of smaller constituent tensors arranged in the shape of the network. Tensor networks can therefore represent enormous tensors very efficiently as networks of several, dozens, or even hundreds of smaller tensors, without requiring a large amount of memory. These advantages make tensor networks practical for tasks like image classification and object recognition.
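A toy sketch of this idea (not the TensorNetwork library itself, just plain NumPy): an order-3 tensor with 1,000 entries is defined implicitly as the contraction of three small "core" tensors holding only 80 entries in total. The bond dimension `chi` below is an illustrative choice; the memory savings grow dramatically with the number of indices:

```python
import numpy as np

d, chi = 10, 2   # physical dimension d, small internal "bond" dimension chi

# Three small core tensors whose contraction defines a larger order-3 tensor:
# T[a, b, c] = sum over x, y of A[a, x] * B[x, b, y] * C[y, c]
A = np.random.rand(d, chi)
B = np.random.rand(chi, d, chi)
C = np.random.rand(chi, d)

T = np.einsum('ax,xby,yc->abc', A, B, C)

dense_params = T.size                       # 10**3 = 1000 entries stored directly
network_params = A.size + B.size + C.size   # 20 + 40 + 20 = 80 entries
print(dense_params, network_params)
```

In an actual tensor network algorithm, the dense tensor `T` is never formed; all computations work on the small cores directly.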
Researchers Stoudenmire and Schwab demonstrate how image data can be encoded in tensor networks in their paper Supervised Learning With Quantum-Inspired Tensor Networks. Rather than explicitly constructing the total tensor T_{i1,i2,…,iN} (an order-N tensor encapsulating the image collection) by turning each image into an order-N tensor and summing over all of them, the researchers represent T as the contraction of many smaller constituent tensors in a tensor network, which is far more efficient. The TensorNetwork library was built precisely to support these kinds of tasks.
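The encoding step can be sketched in a few lines. In the Stoudenmire and Schwab scheme, each grayscale pixel value is mapped to a 2-component vector, and an N-pixel image becomes the tensor product of those N vectors, i.e. an order-N tensor with 2^N entries. The toy code below forms that tensor explicitly, which is only feasible for tiny N; the whole point of the tensor network approach is to avoid ever doing so:

```python
import numpy as np

def local_feature_map(x):
    """Map one grayscale pixel x in [0, 1] to a 2-component unit vector,
    following the quantum-inspired encoding of Stoudenmire and Schwab."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

pixels = np.array([0.0, 0.5, 1.0, 0.25])   # a toy 4-pixel "image"

# Tensor product of the local vectors: an order-N tensor with 2**N entries.
phi = local_feature_map(pixels[0])
for x in pixels[1:]:
    phi = np.tensordot(phi, local_feature_map(x), axes=0)

print(phi.shape)   # (2, 2, 2, 2): 2**4 entries for N = 4 pixels
```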
Tensor networks are now widely used in quantum physics, and as a general-purpose library for tensor network algorithms, TensorNetwork is well suited to physics scenarios. Approximating quantum states is a traditional application of tensor networks in physics, and it serves as a natural test of the library's capabilities and efficiency.
Google researchers successfully used TensorNetwork to implement a tree tensor network (TTN) algorithm for approximating the ground state of a periodic quantum spin chain or a lattice model on a thin torus.
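To illustrate the structure of a tree tensor network, here is a hedged, plain-NumPy sketch (not the researchers' actual TTN algorithm): four sites at the leaves are coarse-grained pairwise by two lower tensors, and a top tensor joins the two branches into a single scalar amplitude. All dimensions and tensors below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
d, chi = 2, 3                      # site dimension and internal bond dimension

# Leaves: one vector per physical site.
sites = [rng.standard_normal(d) for _ in range(4)]
left  = rng.standard_normal((d, d, chi))   # coarse-grains sites 0 and 1
right = rng.standard_normal((d, d, chi))   # coarse-grains sites 2 and 3
top   = rng.standard_normal((chi, chi))    # joins the two branches

# Contract the whole tree down to one scalar amplitude.
amplitude = np.einsum('a,b,abx,c,d,cdy,xy->',
                      sites[0], sites[1], left,
                      sites[2], sites[3], right, top)
print(float(amplitude))
```

In a real TTN ground-state calculation, the tree tensors are optimized iteratively to minimize the energy of a Hamiltonian rather than filled with random values.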
Google has open-sourced TensorNetwork on GitHub and released two related papers. TensorNetwork: A Library for Physics and Machine Learning introduces TensorNetwork and its API, and provides background on tensor networks for readers without a physics background. In the other paper, TensorNetwork on TensorFlow: A Spin Chain Application Using Tree Tensor Networks, Google researchers show how TensorNetwork is applied in physics cases, and explain how GPUs are used to accelerate the computations.
Google says the next step will be using TensorNetwork to conduct image classification tasks on the MNIST and Fashion-MNIST datasets. The researchers expect new features to be added to TensorNetwork through the open source community: “We hope that TensorNetwork will become a valuable tool for physicists and machine learning practitioners.”
Author: Herin Zhao | Editor: Michael Sarazen