2018 AI Index Report: AI Job Openings Surge 35X Since 2015
To help interested parties stay abreast of all that’s happening in AI, a team of researchers and experts from top institutions — including Stanford, MIT, OpenAI, SRI International and others — has published the 2018 AI Index Report, which delves into the data behind the tech to track its global growth.
(Synced) | Download the report here
NYU AI Now Report 2018
The past year has seen accelerated integration of powerful artificial intelligence systems into core social institutions, against a backdrop of rising inequality, political populism, and industry scandals. There have been major movements from both inside and outside technology companies pushing for greater accountability and justice.
Download the report here
2018 in Review: 10 AI Failures
AI has achieved remarkable progress, and many scientists dream of creating the Master Algorithm proposed by Pedro Domingos — one that can solve all problems envisioned by humans. It’s unavoidable, however, that researchers, fledgling technologies and biased data will also produce blunders not envisioned by humans.
The 18 Biggest Computer Science Stories of 2018
While “juul” might not be a programming language decades from now, over the last year there have been plenty of words that took on new meanings. From the massive Intel security flaw “Meltdown” to Google’s “Duplex” AI, there’s been no shortage of big stories in computer science this year. In no particular order, here are 18 of the highlights from 2018.
GAN 2.0: NVIDIA’s Hyperrealistic Face Generator
The NVIDIA paper proposes an alternative generator architecture for GANs that draws insights from style transfer techniques. The system can learn and separate different aspects of an image without supervision, enabling intuitive, scale-specific control of the synthesis.
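The scale-specific style control described above is built on adaptive instance normalization (AdaIN): each feature map is normalized, then re-scaled and shifted with parameters derived from a style code. Below is a minimal, hypothetical numpy sketch of that one operation — not the paper's full generator, and the array shapes are illustrative assumptions.

```python
import numpy as np

def adain(x, style_scale, style_bias, eps=1e-5):
    """Adaptive instance normalization on a (channels, H, W) feature map:
    normalize each channel to zero mean / unit variance, then apply a
    per-channel scale and bias derived from a style code."""
    mu = x.mean(axis=(1, 2), keepdims=True)       # per-channel mean
    sigma = x.std(axis=(1, 2), keepdims=True)     # per-channel std
    normalized = (x - mu) / (sigma + eps)
    return style_scale[:, None, None] * normalized + style_bias[:, None, None]

# Illustrative 3-channel feature map and a style that sets new statistics.
rng = np.random.default_rng(0)
features = rng.normal(size=(3, 8, 8))
scale = np.array([1.0, 2.0, 0.5])
bias = np.array([0.0, 1.0, -1.0])
styled = adain(features, scale, bias)
```

After the call, each channel's mean matches the style bias and its spread matches the style scale — which is how injecting different styles at different layers yields coarse-to-fine control.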
Open-sourcing PyText for Faster NLP Development
To make it easier to build and deploy natural language processing (NLP) systems, we are open-sourcing PyText, a modeling framework that blurs the boundaries between experimentation and large-scale deployment. PyText is a library built on PyTorch, our unified, open source deep learning framework.
How AI Training Scales
OpenAI discovered that the gradient noise scale, a simple statistical metric, predicts the parallelizability of neural network training on a wide range of tasks. Since complex tasks tend to have noisier gradients, increasingly large batch sizes are likely to become useful in the future, removing one potential limit to further growth of AI systems.
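The simplest form of the gradient noise scale compares the variance of per-example gradients to the magnitude of the mean gradient: when per-example gradients mostly agree, small batches suffice; when they are noisy, larger batches keep helping. A minimal numpy sketch of that estimator, using fabricated per-example gradients in place of real backprop output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-example gradients: 256 examples, 10 parameters.
# In practice these would come from backpropagation on single examples.
grads = rng.normal(loc=0.1, scale=1.0, size=(256, 10))

G = grads.mean(axis=0)                              # estimate of the true gradient
trace_cov = ((grads - G) ** 2).sum(axis=1).mean()   # estimate of tr(Sigma)
noise_scale = trace_cov / (G @ G)                   # B_simple = tr(Sigma) / |G|^2

print(f"estimated noise scale: {noise_scale:.1f}")
```

A large value indicates noisy gradients relative to the signal, i.e. a task where bigger batch sizes should still pay off — the trend the blurb says complex tasks exhibit.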
Deep Learning for Classical Japanese Literature
In this work, the research team introduces Kuzushiji-MNIST, a dataset which focuses on Kuzushiji (cursive Japanese), as well as two larger, more challenging datasets, Kuzushiji-49 and Kuzushiji-Kanji. Through these datasets, the researchers hope to engage the machine learning community with the world of classical Japanese literature.
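Because Kuzushiji-MNIST mirrors the MNIST format (28×28 grayscale images, 10 classes), existing MNIST pipelines apply with no changes. The sketch below uses fabricated stand-in arrays of the same shape so it runs without downloading the real files, which are distributed through the project's repository.

```python
import numpy as np

# Stand-in arrays with Kuzushiji-MNIST's shape and dtype: 28x28 grayscale
# images with labels in 10 classes. (Real data would be loaded from the
# project's released files instead.)
rng = np.random.default_rng(0)
train_imgs = rng.integers(0, 256, size=(100, 28, 28), dtype=np.uint8)
train_labels = rng.integers(0, 10, size=100)

# The same preprocessing used for MNIST works unchanged:
X = train_imgs.reshape(len(train_imgs), -1).astype(np.float32) / 255.0
print(X.shape)  # (100, 784)
```

This drop-in compatibility is the point of the dataset's design: any model benchmarked on MNIST can be re-evaluated on a harder, culturally significant task with a one-line data swap.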
You May Also Like
A Peek Inside Andrew Ng’s “AI Transformation Playbook”
Andrew Ng is touting his new, free AI Transformation Playbook. The 12-page online document is a unique package of artificial intelligence tricks aimed at the increasing number of CEOs who want to transform their companies by introducing AI technologies. The AI mastermind and founder of Landing.ai released the playbook today.
Alibaba Healthcare AI Targets Macau’s Flu Season
Alibaba has trained machine learning models with data on disease trends and historical flu analysis to predict the risk of outbreak intensity and disease transmission over the next two weeks. It’s hoped that informing citizens and local health bureaus will help with both preparation and prevention.
Global AI Events
January 24–25, 2019 The AI Assistant Summit. San Francisco, United States
Jan 27 – Feb 1, 2019 AAAI 2019: Association for the Advancement of Artificial Intelligence. Hawaii, United States
Global AI Opportunities