In conjunction with yesterday’s release of open source AI software framework PyTorch 1.0, leading deep learning course developer Fast.ai has announced its first open source library for deep learning — fastai v1.
Fastai is built on top of PyTorch, and provides “a single consistent API” to various common deep learning applications and data types.
Most deep learning libraries today require specialist knowledge and different code written against different APIs depending on the application. With fastai, the same API can be used to run tasks across applications and data types including vision, text, and tabular data.
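As an illustration of that single consistent API, a vision workflow in fastai v1 follows the same DataBunch-to-Learner pattern used by the text and tabular modules. The sketch below is illustrative rather than code from the announcement; the data path and parameters are assumptions.

```python
# Minimal sketch of fastai v1's vision workflow; 'data/images' is assumed
# to contain one subfolder of images per class.
from fastai.vision import *

data = ImageDataBunch.from_folder('data/images', valid_pct=0.2,
                                  ds_tfms=get_transforms(), size=224)
learn = create_cnn(data, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(4)  # fastai's one-cycle training policy

# The text and tabular modules mirror this pattern: build a DataBunch from
# your data, wrap it in a Learner, then call fit_one_cycle.
```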
Fastai was initially announced in September 2017. An alpha version was made available to some early users before today’s release.
In a recent fast.ai blog post, OpenAI Research Fellow Christine McLeavey Payne shared her research on a neural net music generator named Clara. Payne says it took her only two weeks to build the music generator and get initial results using fastai. She tweeted, “I took a fastai Language Model almost exactly (very slight changes in sampling the generation) and experimented with ways to write out the music in either a ‘notewise’ or ‘chordwise’ encoding.”
Another impressive fastai use case comes from architect and investor Miguel Pérez Michaus, who used the alpha version for his Style Reversion experiments. “Such a powerful tool with such a simple idea,” says Pérez Michaus, whose results suggest that even those without a formal computer science background can use the tool for deep learning applications.

Fast.ai used Kaggle’s Dogs vs Cats competition to demonstrate the research breakthroughs embedded in the library. The results, summarized in the table below, showed that a task requiring 31 lines of code in Keras, one of the most popular high-level APIs for building and training deep learning models, took only five lines of fastai code.
|  | fastai resnet34* | fastai resnet50 | Keras |
| --- | --- | --- | --- |
| Lines of code (excluding imports) | 5 | 5 | 31 |
| Stage 1 error | 0.70% | 0.65% | 2.05% |
| Stage 2 error | 0.50% | 0.50% | 0.80% |
| Test time augmentation (TTA) error | 0.30% | 0.40% | N/A* |
| Stage 1 time (min:sec) | 4:56 | 9:30 | 8:30 |
| Stage 2 time (min:sec) | 6:44 | 12:48 | 5:38 |
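For context, a two-stage fastai v1 run of the kind tabulated above could look roughly like the sketch below. This is an illustrative reconstruction, not fast.ai’s actual benchmark code; the dataset path, epoch counts, and learning rates are assumptions.

```python
# Rough sketch of a ~5-line fastai v1 Dogs vs Cats run; the path, epoch
# counts, and learning rates are illustrative assumptions.
from fastai.vision import *

data = ImageDataBunch.from_folder('data/dogscats', ds_tfms=get_transforms(), size=224)
learn = create_cnn(data, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(1)                     # stage 1: train only the new head
learn.unfreeze()                           # make the pretrained backbone trainable
learn.fit_one_cycle(1, slice(1e-5, 1e-3))  # stage 2: fine-tune the whole network
preds, targets = learn.TTA()               # test time augmentation (the TTA row)
```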
Fast.ai Founder Jeremy Howard emphasized that they had used Keras in the comparison not to trounce it, but to show their admiration for “the strongest benchmark” they know of.
Tesla Director of AI Andrej Karpathy spoke highly of fastai at the PyTorch conference: “There’s nothing better out there.” Howard returned the goodwill in a tweet praising Karpathy’s Stanford CS231n course notes.
Fastai download instructions are on its GitHub page. The library can be installed via conda or pip, is also available on Google Cloud Platform, and will soon come to AWS.
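The commands below reflect the conda and pip routes as documented around the v1 release; consult the GitHub page for up-to-date instructions.

```sh
# Conda route, using the pytorch and fastai channels
conda install -c pytorch -c fastai fastai

# pip route (a suitable PyTorch build must be installed first)
pip install fastai
```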
Author: Mos Zhang | Editor: Michael Sarazen