
ICML Announces Best Paper Awards; What to Expect at CVPR 2019

Synced Global AI Weekly June 16th

Subscribe to Synced Global AI Weekly

ICML 2019 | Google, ETH Zurich, MPI-IS & Cambridge Share Best Paper Honours
ICML announced the recipients of its Best Paper Awards: Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations from Google Research, ETH Zurich, and Max Planck Institute for Intelligent Systems; and Rates of Convergence for Sparse Variational Gaussian Process Regression from the University of Cambridge and …

ICML 2019 | Good Papers Collection
Google at ICML 2019
Facebook Research at ICML 2019
Intel AI Research at ICML 2019
Google Brain | Similarity of Neural Network Representations Revisited
Georgia Institute of Technology & Ant Financial | Generative Adversarial User Model for Reinforcement Learning Based Recommendation System
CMU & UC Berkeley | Learning Correspondence from the Cycle-consistency of Time
BAIR | Learning to Learn with Probabilistic Task Embeddings

What to Expect at CVPR 2019
Microsoft at CVPR 2019
IBM Research AI at CVPR 2019
Baidu at CVPR 2019
NVIDIA Research at CVPR 2019
Intel AI Research at CVPR 2019
Facebook AI | Creating 2.5D Visual Sound for An Immersive Audio Experience
CVPR 2019 Accepted Papers, Organized in a Parsable, Easier-to-Sort Format


What Does BERT Look At? An Analysis of BERT’s Attention
Researchers propose methods for analyzing the attention mechanisms of pre-trained models and apply them to BERT. BERT’s attention heads exhibit patterns such as attending to delimiter tokens, specific positional offsets, or broadly attending over the whole sentence, with heads in the same layer often exhibiting similar behaviors.
(Stanford University & Facebook AI Research)
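The kind of head-level statistic such an analysis computes can be sketched with synthetic data: for each attention head, measure the average attention mass it directs at a delimiter token such as [SEP]. The matrices below are random stand-ins, not BERT's actual attention maps, and the 0.5 threshold is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)
n_heads, seq_len = 12, 8
sep_pos = seq_len - 1  # pretend the delimiter sits at the last position

# Random attention maps: each query row is a distribution over keys.
logits = rng.normal(size=(n_heads, seq_len, seq_len))
attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

# Turn head 0 into a "delimiter head" that puts all its mass on [SEP].
attn[0] = 0.0
attn[0, :, sep_pos] = 1.0

# Fraction of each head's attention landing on the delimiter,
# averaged over query positions; flag heads above a threshold.
sep_mass = attn[:, :, sep_pos].mean(axis=1)
delimiter_heads = np.where(sep_mass > 0.5)[0]
print(sep_mass.round(3), delimiter_heads)
```

Run on real BERT attention maps, the same statistic surfaces the delimiter-attending heads the paper describes.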

Are Weights Really Important to Neural Networks?
As with the age-old “nature versus nurture” debate, AI researchers want to know whether architecture or weights play the main role in the performance of neural networks. In a blow to the “nurture” side, Google researchers have now demonstrated that a neural network which has not learned weights through training can still achieve satisfactory results in machine learning tasks.
(Synced) / (Google Brain)
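A toy example conveys the weight-agnostic intuition: fix a tiny architecture, tie every connection to one shared weight, and the topology alone can encode a computation. The network and task below are invented for illustration (the paper searches over architectures on RL and classification benchmarks); here, the architecture outputs a value whose sign matches the sign of the input sum for any nonzero shared weight.

```python
import numpy as np

def forward(x, w):
    """Tiny fixed architecture; every connection shares the weight w."""
    h = np.tanh(w * np.sum(x))  # single hidden unit
    return np.tanh(w * h)       # single output unit

# The sign of the output equals the sign of the input sum regardless
# of the shared weight's value: the structure, not the weight, does
# the work.
inputs = [np.array([1.0, 2.0]), np.array([-3.0, 1.0]), np.array([0.5, -2.0])]
results = {w: [np.sign(forward(x, w)) for x in inputs]
           for w in (-2.0, -0.5, 0.5, 2.0)}
print(results)
```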

Episodic Memory in Lifelong Language Learning
Researchers introduce a lifelong language learning setup where a model needs to learn from a stream of text examples without any dataset identifier. They propose an episodic memory model that performs sparse experience replay and local adaptation to mitigate catastrophic forgetting in this setup.
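Sparse experience replay can be sketched in a few lines; the interval, batch size, and capacity below are illustrative, and a reservoir-sampled buffer stands in for the paper's memory module: keep a uniform sample of the example stream, and at a fixed interval mix stored examples back into training.

```python
import random

class EpisodicMemory:
    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0

    def store(self, example):
        # Reservoir sampling: every example in the stream ends up in
        # the buffer with equal probability.
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        return random.sample(self.buffer, min(k, len(self.buffer)))

memory = EpisodicMemory(capacity=100)
replay_every, replay_batch = 10, 4

for step, example in enumerate(range(500), start=1):  # stand-in text stream
    # ...train on the new example here...
    memory.store(example)
    if step % replay_every == 0:
        replayed = memory.sample(replay_batch)
        # ...also train on `replayed` to counter catastrophic forgetting...
```

Replay stays "sparse" because stored examples are revisited only once every `replay_every` steps, keeping the overhead small relative to training on the stream itself.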

You May Also Like

New Facebook PyTorch Hub Facilitates Reproducibility Testing
In a bid to provide a smoother reproduction experience, Facebook has announced the beta release of PyTorch Hub, a new pretrained model repository designed to facilitate research reproducibility testing.

BatchNorm + Dropout = DNN Success!
A group of researchers from Tencent Technology, the Chinese University of Hong Kong, and Nankai University recently combined two commonly used techniques — Batch Normalization (BatchNorm) and Dropout — into an Independent Component (IC) layer inserted before each weight layer to make inputs more independent.
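A minimal numpy sketch of an IC layer as described: batch normalization followed by dropout, applied to the input of a weight layer. The dropout rate, epsilon, and shapes are illustrative choices, not the paper's settings.

```python
import numpy as np

def ic_layer(x, drop_p=0.2, eps=1e-5, training=True):
    # Batch normalization: zero mean, unit variance per feature.
    x = (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)
    if training:
        # Inverted dropout: zero out units, rescale survivors so the
        # expected activation is unchanged.
        mask = (np.random.rand(*x.shape) >= drop_p) / (1.0 - drop_p)
        x = x * mask
    return x

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=5.0, size=(256, 16))
out = ic_layer(x)  # normalized, sparsified input for the next weight layer
```

Normalizing and then randomly zeroing units reduces correlations between the inputs a weight layer sees, which is the "independence" the IC layer is after.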

Global AI Events

June 15-21: Computer Vision and Pattern Recognition (CVPR 2019) in Long Beach, United States

June 20-21: AI for Good Summit in San Francisco, United States

June 28: Research and Applied AI Summit (RAAIS) in London, United Kingdom

August 19-23: Knowledge Discovery and Data Mining (KDD2019) in London, United Kingdom

Global AI Opportunities

Research Scientist, Google Brain Toronto

OpenAI Seeking Software Engineers and Deep Learning Researchers

DeepMind is Recruiting

DeepMind Scholarship: Access to Science

Postdoctoral Researcher (AI) – Self-Supervised Learning

LANDING AI is recruiting

Stay tuned with AI!
Subscribe to Synced Global AI Weekly
