
PyTorch 1.2 Supports Transformer and TensorBoard; Summer Hackathon Announced

PyTorch v1.2 brings the machine learning community further improvements, including official support for Transformer, TensorBoard, and more.

Since Facebook launched PyTorch in early 2017, the open-source machine learning framework has become wildly popular. Its flexible, dynamic programming environment and user-friendly interface make PyTorch ideal for quick experimentation, and the Facebook-developed and maintained framework continues to grow within the developer community.

The official release of PyTorch 1.0 in December 2018 addressed a range of issues spanning reusability, performance, programming language support, and scalability. Now, PyTorch v1.2 brings the machine learning community further improvements, including official support for Transformer, TensorBoard, and more.

Based on the paper Attention Is All You Need, PyTorch v1.2 incorporates a standard nn.Transformer module, which models global dependencies between input and output by relying entirely on the attention mechanism. The components of nn.Transformer are designed so they can also be used individually; for example, an nn.TransformerEncoder can operate on its own, without the larger nn.Transformer.
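
A minimal usage sketch of the module (the sequence lengths and batch size below are illustrative, using the module's default hyperparameters):

    import torch
    import torch.nn as nn

    # Full encoder-decoder Transformer with the module's default settings.
    transformer = nn.Transformer(d_model=512, nhead=8,
                                 num_encoder_layers=6, num_decoder_layers=6)
    src = torch.rand(10, 32, 512)   # (source length, batch size, embedding dim)
    tgt = torch.rand(20, 32, 512)   # (target length, batch size, embedding dim)
    out = transformer(src, tgt)     # -> shape (20, 32, 512)

    # The components also work on their own, e.g. an encoder-only stack:
    encoder_layer = nn.TransformerEncoderLayer(d_model=512, nhead=8)
    encoder = nn.TransformerEncoder(encoder_layer, num_layers=6)
    memory = encoder(src)           # -> shape (10, 32, 512)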

PyTorch 1.2 now officially supports TensorBoard; the integration is no longer experimental and can be enabled simply by writing “from torch.utils.tensorboard import SummaryWriter”.
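
A minimal logging sketch (the log directory, tag name, and loss values below are placeholders):

    from torch.utils.tensorboard import SummaryWriter

    # Log scalars to TensorBoard; "runs/example" is an arbitrary example path.
    writer = SummaryWriter(log_dir="runs/example")
    for step in range(100):
        loss = 1.0 / (step + 1)                  # stand-in for a real training loss
        writer.add_scalar("train/loss", loss, step)
    writer.close()
    # View the results with: tensorboard --logdir runs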

In a joint effort with Microsoft, PyTorch 1.2 fully supports exporting ONNX Opset versions 7 (v1.2), 8 (v1.3), 9 (v1.4), and 10 (v1.5). PyTorch 1.2 also enhances the constant folding pass (a process for simplifying constant expressions at compile time) to support the latest available ONNX version, Opset 10. Users can now register their own symbolic functions to export custom operators and can specify the dynamic dimensions of inputs when exporting (a minimal export sketch follows the summary below).

Below is a summary of major improvements for ONNX:

  • Support for multiple Opsets including the ability to export dropout, slice, flip, and interpolate in Opset 10.
  • Improvements to ScriptModule including support for multiple outputs, tensor factories, and tuples as inputs and outputs.
  • More than a dozen additional PyTorch operators supported including the ability to export a custom operator.
  • Many bug fixes and test infrastructure improvements.
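
As referenced above, a rough export sketch; the torchvision ResNet-18 model and the axis names are illustrative choices, not taken from the release notes:

    import torch
    import torchvision

    # Export a model to ONNX, targeting Opset 10 and marking the batch dimension as dynamic.
    model = torchvision.models.resnet18(pretrained=True).eval()
    dummy_input = torch.randn(1, 3, 224, 224)

    torch.onnx.export(
        model,
        dummy_input,
        "resnet18.onnx",
        opset_version=10,                     # target ONNX Opset 10
        input_names=["input"],
        output_names=["output"],
        dynamic_axes={"input": {0: "batch"},  # batch dimension left dynamic
                      "output": {0: "batch"}},
    )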

The latest PyTorch-to-ONNX tutorial can be found here.

PyTorch 1.2 also provides a new, easier-to-use TorchScript API for converting nn.Modules to TorchScript, and TorchScript's support for Python language constructs and the Python standard library has been significantly improved (a short sketch follows the release list below). The announcement also covers new releases of the domain libraries: torchvision 0.4, torchaudio 0.3, and torchtext 0.4.

  • Torchvision 0.4 with support for video
  • Torchaudio 0.3 with Kaldi compatibility and new transforms
  • Torchtext 0.4 with supervised learning datasets

All these releases can be found at pytorch.org.
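
As noted above, a rough sketch of converting an nn.Module with the TorchScript API; the MyCell module below is a made-up example, not from the release notes:

    import torch
    import torch.nn as nn

    # MyCell is a toy module used only to illustrate torch.jit.script.
    class MyCell(nn.Module):
        def __init__(self):
            super().__init__()
            self.linear = nn.Linear(4, 4)

        def forward(self, x, h):
            return torch.tanh(self.linear(x) + h)

    scripted = torch.jit.script(MyCell())   # compile the module to TorchScript
    scripted.save("my_cell.pt")             # serialize it for loading without Python
    print(scripted.code)                    # inspect the generated TorchScript code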

PyTorch 1.2 | Global Summer Hackathon

Facebook also announced an online Global PyTorch Summer Hackathon, which aims to collect creative and well-implemented solutions that use PyTorch to make a positive impact on businesses and society. The submission period is now open and runs through September 16. The Hackathon features over US$60,000 in cash prizes, and winners will also get the opportunity to showcase their projects at the PyTorch Developer Conference on October 10, 2019.

For more details and to register, visit the Summer Hackathon page on the PyTorch website.


Author: Herin Zhao | Editor: Michael Sarazen
