
A Round Up Of NVIDIA’s GPU Technology Conference (GTC)

Subscribe to Synced Global AI Weekly


GTC 2019 | Highlights & Disappointments at NVIDIA’s Annual Conference
NVIDIA’s annual GPU Technology Conference (GTC) attracted some 9,000 developers, buyers and innovators to San Jose, California this week. CEO and Co-Founder Jensen Huang’s two-and-a-half hour keynote speech spanned GPU-based innovations in domains ranging from graphic design to autonomous driving.
(Synced) 


GTC 2019 | Huang Kicks Off GTC, Focuses on NVIDIA Data Center Momentum, Blue Chip Partners
NVIDIA’s message was unmistakable as it kicked off the 10th annual GPU Technology Conference: it’s doubling down on the data center. Founder and CEO Jensen Huang delivered a sweeping opening keynote at San Jose State University, describing the company’s progress accelerating the sprawling data centers that power the world’s most dynamic industries.
(NVIDIA) (GTC 2019 Keynote)


GTC 2019 | NVIDIA’s New GauGAN Transforms Sketches Into Realistic Images
GTC 2019 | New NVIDIA One-Stop AI Framework Accelerates Workflows by 50x
GTC 2019 | NVIDIA CEO Says No Rush on 7nm GPU; Company Clearing Its Crypto Chip Inventory
GTC 2019 | Toyota doubles down on Nvidia tech for self-driving cars
GTC 2019 | Nvidia’s T4 GPUs are coming to the AWS cloud

Technology

Coconet: The ML Model Behind Today’s Bach Doodle
Google celebrated J.S. Bach’s 334th birthday with the first AI-powered Google Doodle and introduced Coconet, the machine learning model behind it. Users can compose their own melody, and the model will harmonize it in Bach’s style.
(Google AI)


Reducing The Need for Labeled Data in Generative Adversarial Networks
Generative adversarial networks (GANs) are a powerful class of deep generative models. The main idea behind GANs is to train two neural networks: a generator, which learns to synthesise data (such as images), and a discriminator, which learns to distinguish real data from samples synthesised by the generator.
(Google AI)
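As a minimal sketch of the two-network setup described above: the snippet below uses toy numpy stand-ins (an affine "generator" and a logistic "discriminator" with hand-picked weights, not trained networks) to show how the standard GAN losses are computed from real and generated samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the two networks (illustrative, not trained):
def generator(z, w=1.5, b=0.2):
    # Maps latent noise z into data space.
    return w * z + b

def discriminator(x, w=2.0, b=-1.0):
    # Maps a sample to a probability of being "real" via a sigmoid.
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

z = rng.standard_normal(64)           # latent noise batch
x_real = rng.normal(3.0, 1.0, 64)     # "real" data drawn from N(3, 1)
x_fake = generator(z)                 # synthesised samples

# Discriminator objective: score real data high, fake data low.
d_loss = (-np.mean(np.log(discriminator(x_real)))
          - np.mean(np.log(1.0 - discriminator(x_fake))))

# Generator objective (non-saturating form): fool the discriminator.
g_loss = -np.mean(np.log(discriminator(x_fake)))

print(float(d_loss), float(g_loss))
```

In actual training, the two losses would be minimised alternately with gradient descent, each network updating its own weights.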


Implicit Generation And Generalization in Energy-Based Models 
In this work, researchers advocate for using continuous energy-based models (EBMs), represented as neural networks, for generative tasks and as a route to models that generalize. These models aim to learn an energy function E(x) that assigns low energy values to inputs x in the data distribution and high energy values to other inputs.
(MIT & OpenAI) 
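To make the energy-function idea concrete, here is a minimal sketch (a hand-written quadratic energy in numpy, not the paper’s neural-network EBM) of drawing samples from low-energy regions with Langevin dynamics — a noisy gradient descent on E(x):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy energy E(x) = (x - 2)^2 / 2: low energy near x = 2,
# standing in for "inputs in the data distribution".
mu = 2.0
def grad_energy(x):
    return x - mu  # analytic gradient of E

# Langevin dynamics: step against the energy gradient plus Gaussian
# noise; samples approximate draws from p(x) proportional to exp(-E(x)).
step = 0.1
x = rng.standard_normal(500)  # 500 parallel chains, random init
for _ in range(200):
    x = (x - 0.5 * step * grad_energy(x)
         + np.sqrt(step) * rng.standard_normal(500))

print(float(x.mean()))  # chains concentrate near the low-energy region x = 2
```

The same procedure is what lets an EBM both score inputs (via E(x)) and generate new samples, without a separate generator network.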

You May Also Like

New Study Uses Machine Learning to Predict Sexual Orientation
A new dissertation by John Leuner, a master’s student in Information Technology at the University of Pretoria, has revisited the thorny question of whether machine learning methods can effectively detect sexual orientation.
(Synced)


AttoNets: Compact and Efficient DNNs Realized via Human-Machine Collaborative Design
It is no secret that deep neural networks (DNNs) can achieve state-of-the-art performance in a wide range of complicated tasks. DNN models such as BigGAN, BERT, and GPT-2 have proved the high potential of deep learning. Deploying DNNs on mobile devices, consumer devices, drones, and vehicles, however, remains a bottleneck for researchers.
(Synced)

Global AI Events

March 25-26, MIT Technology Review’s EmTech Digital in San Francisco, United States

March 25-28, Strata Data Conference in San Francisco, United States

April 9-11, Google Cloud Next in San Francisco, United States

April 11, Applied Machine Learning Conference in Charlottesville, United States

Global AI Opportunities

2019 Google AI Residency Program

Research Scientist, Google Brain Toronto

LANDING AI is recruiting

DeepMind Scholarship: Access to Science


Stay on top of AI!
Subscribe to Synced Global AI Weekly
