In a joint effort with the Perimeter Institute for Theoretical Physics and X (an Alphabet/Google company), Google AI researchers recently announced a new open source library, TensorNetwork, which can greatly improve the efficiency of tensor calculations for tensor network algorithms.
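Much of that efficiency comes from choosing a good order in which to contract the tensors in a network. The snippet below is a plain NumPy illustration of that idea, not the TensorNetwork API: a three-tensor chain contracted in two different orders gives the same result, but the size of the intermediate tensor (and hence the cost) differs.

```python
import numpy as np

# Illustration only: contracting a small "network" of three tensors.
# A tensor network library picks the pairwise contraction order; here we
# contract a matrix chain two ways and check the results agree.
A = np.random.rand(2, 50)   # 2 x 50
B = np.random.rand(50, 50)  # 50 x 50
C = np.random.rand(50, 2)   # 50 x 2

left_first = (A @ B) @ C    # intermediate (A @ B) is only 2 x 50
right_first = A @ (B @ C)   # intermediate (B @ C) is only 50 x 2
einsum_all = np.einsum('ij,jk,kl->il', A, B, C)

assert np.allclose(left_first, right_first)
assert np.allclose(left_first, einsum_all)
```

For larger networks the gap between a good and bad contraction order grows exponentially, which is why dedicated libraries automate the choice.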
Facebook AI Research has announced it is open-sourcing PyTorch-BigGraph (PBG), a tool that can easily process and produce graph embeddings for extremely large graphs. PBG can also handle multi-relation graph embeddings in cases where the model is too large to fit in memory.
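A multi-relation graph embedding assigns a vector to every node and a transformation to every relation type. The sketch below shows one common scoring scheme from this family, a translational (TransE-style) score, in plain NumPy; it is an illustration of the idea, not the PBG API, and the entity and relation names are made up.

```python
import numpy as np

# TransE-style score: a triple (head, relation, tail) is plausible when
# head_vector + relation_vector lands near tail_vector.
rng = np.random.default_rng(0)
dim = 16
node_emb = {n: rng.normal(size=dim) for n in ["alice", "bob", "paris"]}
rel_emb = {r: rng.normal(size=dim) for r in ["knows", "lives_in"]}

def score(head, relation, tail):
    # Negative distance, so higher score = more plausible triple.
    return -float(np.linalg.norm(node_emb[head] + rel_emb[relation] - node_emb[tail]))

s = score("alice", "knows", "bob")
```

Systems like PBG train embeddings of this kind by partitioning the node set so that only a slice of the full embedding table needs to be in memory at a time.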
Google yesterday announced a new program, Season of Docs, that aims to make a substantive contribution to open source software development. The eight-month project will assemble a team of technical writers to work on improving documentation for various open source projects.
Natural language processing has made significant progress in the past year, but few frameworks focus directly on NLP or sequence modeling. Google Brain recently released Lingvo, a deep learning framework based on TensorFlow. Synced invited Ni Lao, Chief Science Officer at Mosaix, to share his thoughts on Lingvo.
Facebook AI Research (FAIR) introduced their own Go bot last year, aiming to reproduce AlphaGo Zero results using ELF, their Extensive, Lightweight, and Flexible platform for reinforcement learning research. FAIR recently added new features to ELF OpenGo and has open-sourced the project.
The San Francisco-based AI non-profit, however, has raised eyebrows in the research community with its unusual decision not to release the language model’s code and training dataset. In a statement sent to Synced, OpenAI explained the choice was made to prevent malicious use: “it’s clear that the ability to generate synthetic text that is conditioned on specific subjects has the potential for significant abuse.”
Last December some 9,000 attendees packed a single venue in Montreal for a week-long academic conference. NeurIPS was completely sold out, the latest indication of just how hot AI is nowadays. As AI and machine learning continue to ignite discussion across a wide variety of disciplines, novel approaches to the tech are also garnering interest.
Alibaba Cloud recently announced that it has open sourced Mars — its tensor-based framework for large-scale data computation — on GitHub. Mars can be regarded as “a parallel and distributed NumPy.” Mars can tile a large tensor into small chunks and describe the inner computation with a directed graph, enabling parallel computation to run on a wide range of distributed environments, from a single machine to a cluster comprising thousands of machines.
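The tiling idea can be sketched in plain NumPy (this is a conceptual illustration, not the Mars API): split a large array into chunks, compute a partial result per chunk, then combine the partials. The graph of per-chunk operations is what a framework like Mars can schedule across machines.

```python
import numpy as np

def chunked_sum(x, chunk_size):
    # Each slice is an independent task; partial sums are combined at the end.
    partials = [x[i:i + chunk_size].sum() for i in range(0, len(x), chunk_size)]
    return float(sum(partials))

x = np.arange(1_000_000, dtype=np.float64)
assert np.isclose(chunked_sum(x, 100_000), x.sum())
```

Because each chunk's task depends only on its own slice, the tasks can run in parallel on one machine's cores or be distributed across a cluster.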
Tencent AI Lab has announced an open-source NLP dataset comprising vector representations for eight million Chinese words and phrases. The dataset aims to provide large-scale and high-quality support for deep learning-based Chinese language NLP research in both academic and industrial applications.
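A typical use of such a dataset is measuring semantic similarity between words via the cosine of their vectors. The sketch below uses short, made-up vectors standing in for entries from the released file; only the cosine computation itself is the point.

```python
import numpy as np

# Hypothetical stand-ins for entries from the embedding file
# (the real vectors are much higher-dimensional; values here are invented).
emb = {
    "猫": np.array([0.8, 0.1, 0.3]),    # cat
    "狗": np.array([0.7, 0.2, 0.4]),    # dog
    "经济": np.array([-0.5, 0.9, -0.1]),  # economy
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# With well-trained embeddings, related words score higher than unrelated ones.
assert cosine(emb["猫"], emb["狗"]) > cosine(emb["猫"], emb["经济"])
```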
DeepMind announced today that it has opened its Graph Nets (GN) library to the public, enabling the use of graph networks in TensorFlow and Sonnet. Graph Nets is a machine learning framework that was published by DeepMind, Google Brain, MIT, and the University of Edinburgh on June 15.
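At the core of a graph network is a message-passing step: edge features are aggregated into their receiver nodes, which then update their own state. Below is a minimal NumPy sketch of that aggregation, an illustration of the concept rather than the Graph Nets API.

```python
import numpy as np

# Edge i runs from senders[i] to receivers[i].
senders = np.array([0, 1, 2])
receivers = np.array([1, 2, 0])
edge_feat = np.ones((3, 4))    # 3 edges, 4 features each
node_feat = np.zeros((3, 4))   # 3 nodes, 4 features each

# Aggregate: sum each edge's features into its receiver node.
agg = np.zeros_like(node_feat)
np.add.at(agg, receivers, edge_feat)

# Update: a residual add here; a real model applies a learned function.
node_feat = node_feat + agg
```

A full graph network block also updates edge and global features with learned functions; the aggregation pattern above is the structural piece that distinguishes it from a plain feed-forward network.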
The DeepMimic paper’s first author, Berkeley PhD student Xue Bin Peng, has now open-sourced the project’s code, data, and frameworks. Moreover, Peng’s new research demonstrates that DeepMimic’s simulated characters can also learn to perform highly dynamic movements by using regular video clips of human examples as input data.