Tag: Neural Network

AI Technology

DeepMind AI Flunks High School Math Test

DeepMind trained and tested its neural model by first collecting a dataset of different types of mathematics problems. Rather than crowd-sourcing the data, the researchers synthesized it, which let them generate a larger number of training examples, control the difficulty level, and reduce training time.
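The synthesis idea can be illustrated with a minimal sketch: programmatically generate question–answer pairs, using operand size as a crude difficulty knob. This is an illustrative toy, not DeepMind's actual generator; the function name and templates are hypothetical.

```python
import random

def generate_arithmetic_problems(n, max_operand, seed=0):
    """Synthesize n question/answer pairs (hypothetical helper).

    max_operand crudely controls difficulty: larger operands make
    harder problems, mimicking how synthetic pipelines tune difficulty.
    """
    rng = random.Random(seed)  # fixed seed -> reproducible dataset
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    problems = []
    for _ in range(n):
        a = rng.randint(0, max_operand)
        b = rng.randint(0, max_operand)
        sym, fn = rng.choice(list(ops.items()))
        problems.append((f"What is {a} {sym} {b}?", str(fn(a, b))))
    return problems
```

Because every answer is computed rather than labeled by humans, the generator can emit millions of consistent examples cheaply, which is the core appeal over crowd-sourcing.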

AI Interview

Google Brain Simplifies Network Learning Dynamics Characterization Under Gradient Descent

Machine learning models based on deep neural networks have achieved unprecedented performance on many tasks. These models are generally regarded as complex systems that are difficult to analyze theoretically. Moreover, because optimization is governed by a high-dimensional, non-convex loss surface, describing the gradient-based dynamics of these models during training is very challenging.
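To see why tractable training dynamics matter, consider the simplest case where they are exactly solvable: a linear model trained with gradient descent on squared loss. This toy numpy sketch (an illustration, not the paper's method) runs plain gradient descent and checks that it converges to the closed-form least-squares minimizer.

```python
import numpy as np

# For a linear model, gradient-descent dynamics on squared loss are
# analytically solvable: the residual decays along the eigenvectors of
# X^T X. The sketch verifies the iterates reach the exact minimizer.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # toy design matrix
y = rng.normal(size=20)        # toy targets

w = np.zeros(3)
lr = 0.01                      # must satisfy lr < 2 / lambda_max(X^T X)
for _ in range(5000):
    grad = X.T @ (X @ w - y)   # gradient of 0.5 * ||Xw - y||^2
    w -= lr * grad

w_star = np.linalg.lstsq(X, y, rcond=None)[0]  # closed-form minimizer
print(np.allclose(w, w_star, atol=1e-5))
```

The characterization Google Brain studies extends this kind of solvability to wide neural networks, whose training dynamics can be approximated by such linear systems; for general deep networks, no comparable closed form exists.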

AI

Yelp: A Neural Net Killed Our App… /jk

The announcement seemed too early for April Fools’, but a Yelp spokesperson confirmed to Synced that it was in fact a gag: “We typically use a joking tone when we write our release notes for the App Store. Our latest note was meant in jest.”

AI Technology

GIF2Video Gives GIFs Realism

The internet loves those little looping action images we call GIFs. They can tell a short visual story in a small, highly portable file. However, the visual quality of GIFs is usually low compared to the videos they were sourced from. If you are sick of fuzzy, low-resolution GIFs, researchers from Stony Brook University, UCLA, and Megvii Research have just the thing for you: “the first learning-based method for enhancing the visual quality of GIFs in the wild.”

AI Technology

Jeff Dean’s 1990 Senior Thesis Is Better Than Yours

Google AI lead Jeff Dean recently posted a link to his 1990 senior thesis on Twitter, setting off a wave of nostalgia for the early days of machine learning in the AI community. Parallel Implementation of Neural Network Training: Two Back-Propagation Approaches may be almost 30 years old and only eight pages long, but the paper does a remarkable job of explaining neural network training methods that underpin the modern development of artificial intelligence.

AI Technology

SJTU & MIT Paper Reinvents Neural Architecture Search; Slashes Computational Resource Requirements

The dearth of AI talent capable of manually designing neural architectures such as AlexNet and ResNet has spurred research into automatic architecture design. Google’s Cloud AutoML, for example, enables developers with limited machine learning expertise to train high-quality models. The trade-off, however, is AutoML’s high computational cost.