AI Industry

Facebook Debuts PyTorch 1.3 With PyTorch Mobile, Quantization, TPU Support and More


Facebook has updated its popular open-source deep-learning library PyTorch. The latest version, PyTorch 1.3, includes PyTorch Mobile, quantization, and Google Cloud TPU support. The release was announced today at the PyTorch Developer Conference in San Francisco.

PyTorch Mobile enables an end-to-end workflow from Python to deployment on iOS and Android. Facebook believes it is increasingly important to run machine learning models directly on devices such as today's supercharged smartphones, as on-device inference lowers latency and can help preserve data privacy, for example through federated learning approaches.
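In practice, the Python side of that workflow means converting a model to TorchScript and saving the serialized file, which is then loaded on device through the LibTorch (iOS) or pytorch_android (Android) runtime. Below is a minimal sketch of the conversion step; the use of a pretrained torchvision MobileNetV2 is purely illustrative.

```python
import torch
import torchvision

# Load a pretrained model (illustrative choice) and switch to inference mode.
model = torchvision.models.mobilenet_v2(pretrained=True)
model.eval()

# Trace the model with an example input to produce a TorchScript module
# that can run without a Python interpreter.
example = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example)

# Save the serialized model; this file is what the mobile runtime loads.
traced.save("mobilenet_v2.pt")
```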

PyTorch Mobile is currently experimental and remains under active development. Facebook AI plans to optimize it for size and performance and to provide a higher-level API. In computer vision and natural language processing, for example, the mobile-native APIs will be extended to cover common preprocessing and integration tasks required in mobile applications.

Another experimental feature in PyTorch 1.3 is quantization. Because neural network inference is computationally expensive and IoT and mobile devices have limited resources, it is vital to use server-side and on-device compute efficiently. Techniques such as 8-bit model quantization can make computation two to four times faster with one-quarter the memory usage.
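As a rough illustration of how this looks in practice, the sketch below applies post-training dynamic quantization to a toy fully connected model with torch.quantization.quantize_dynamic; the layer sizes are arbitrary placeholders, not anything from the release notes.

```python
import torch
import torch.nn as nn

# A toy model standing in for a larger network (hypothetical example).
model = nn.Sequential(
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
model.eval()

# Post-training dynamic quantization: weights of the listed module types
# are stored as 8-bit integers and dequantized on the fly at inference time.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized_model)
```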

Facebook also announced the general availability of Google Cloud TPU support in PyTorch 1.3. Google's custom Tensor Processing Unit (TPU) chips have been widely deployed to accelerate the largest-scale machine learning applications, and PyTorch support for Cloud TPUs is also available in Colab. Another key participant in the cloud provider and hardware ecosystem, Chinese e-commerce powerhouse Alibaba, added support for PyTorch on Alibaba Cloud to enable broader use of PyTorch in AI application development.
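Cloud TPU training goes through the separately installed torch_xla package, which exposes a TPU core as an ordinary torch device. The following is a minimal single-device sketch under that assumption; the model and data are hypothetical stand-ins.

```python
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm  # provided by the torch_xla package

# Acquire a Cloud TPU core as a torch device, analogous to torch.device("cuda").
device = xm.xla_device()

# Tiny illustrative model and batch (hypothetical stand-ins).
model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
inputs = torch.randn(8, 10, device=device)
targets = torch.randn(8, 2, device=device)

loss = nn.functional.mse_loss(model(inputs), targets)
loss.backward()
# optimizer_step also triggers execution of the lazily built XLA graph.
xm.optimizer_step(optimizer, barrier=True)
```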

Facebook today also introduced two new tools: Captum, for explaining the decisions of machine learning models (model interpretability), and CrypTen, for privacy-preserving machine learning.
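As a small illustration of the interpretability side, the sketch below runs Captum's Integrated Gradients on a toy classifier to attribute a prediction back to its input features; the model and input are invented for the example.

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# A small classifier used only for illustration.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 3))
model.eval()

inputs = torch.randn(1, 4, requires_grad=True)

# Integrated Gradients attributes the prediction for the target class
# back to each input feature.
ig = IntegratedGradients(model)
attributions, delta = ig.attribute(inputs, target=0, return_convergence_delta=True)
print(attributions)
```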

According to an O’Reilly report, PyTorch arXiv citations grew 194 percent in the first six months of 2019. The open-source deep-learning library has almost 1,200 contributors helping develop the project, and 22,000 forum users.


Journalist: Fangyu Cai | Editor: Michael Sarazen
