Subscribe to Synced Global AI Weekly
AI @Facebook F8 | Self-Supervision, Fairness, Inclusivity And PyTorch 1.1
On the second day of its annual F8 developer conference, executives from the social media giant framed AI as a weapon in Facebook's battle against objectionable content. "Our goal is to reduce the prevalence by taking action on violent content proactively within minutes," said Facebook CTO Mike Schroepfer.
(Synced) / (F8 Keynotes)
Everything Facebook Announced at F8 2019
Facebook Open-Sources Ax And BoTorch to Simplify AI Model Optimization
At its F8 developer conference, Facebook today launched Ax and BoTorch, two new open-source AI tools. BoTorch, which, as the name implies, is based on PyTorch, is a library for Bayesian optimization. That’s a pretty specialized tool. Ax, on the other hand, is the more interesting launch, as it’s a general-purpose platform for managing, deploying and automating AI experiments.
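The appeal of a platform like Ax is that it manages the try-evaluate-update experiment loop for you. As a rough, hypothetical sketch of that loop (this is not Ax's actual API; a toy random-search optimizer stands in for Ax's adaptive strategies), consider:

```python
import random

def optimize(parameters, evaluation_function, total_trials=100, seed=0):
    """Toy sequential optimizer: samples candidate configurations from the
    given ranges and keeps the best one seen. A stand-in for the kind of
    managed experiment loop a platform like Ax automates (Ax would choose
    candidates adaptively, e.g. via BoTorch's Bayesian optimization)."""
    rng = random.Random(seed)
    best_params, best_value = None, float("inf")
    for _ in range(total_trials):
        # Sample one candidate configuration from the parameter bounds.
        candidate = {p["name"]: rng.uniform(*p["bounds"]) for p in parameters}
        value = evaluation_function(candidate)
        if value < best_value:
            best_params, best_value = candidate, value
    return best_params, best_value

# Minimize a simple quadratic with its optimum at x = 2.
params, value = optimize(
    parameters=[{"name": "x", "bounds": (-5.0, 5.0)}],
    evaluation_function=lambda c: (c["x"] - 2.0) ** 2,
)
```

In real use, `evaluation_function` would train and score a model for each candidate configuration; the point of tools like Ax is to make those trials cheap to manage and to pick candidates more intelligently than random sampling.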
Facebook Is Doubling Down on AI to Clean Up The Social Network
On Monday, Facebook’s chief technology officer, Mike Schroepfer, tested my ability to tell the difference between broccoli and marijuana. He showed me two pictures of green blobs and asked if they depicted the cruciferous vegetable or the mind-altering plant. I guessed both were cannabis; I was wrong. One, apparently, was an image of tempura broccoli.
Biological Structure And Function Emerge from Scaling Unsupervised Learning to 250 Million Protein Sequences
Learning the natural distribution of evolutionary protein sequence variation is a logical step toward predictive and generative modeling for biology. To this end, the researchers use unsupervised learning to train a deep contextual language model on 86 billion amino acids across 250 million sequences spanning evolutionary diversity.
(New York University & Facebook AI Research)
Google Researchers Add Attention to Augment Convolutional Neural Networks
A group of Google researchers led by Quoc Le — the AI expert behind Google Neural Machine Translation and AutoML — have published a paper proposing attention augmentation. In experiments, the novel two-dimensional relative self-attention mechanism delivered “consistent improvements in image classification.”
(Synced) / (Google Brain)
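The core idea is to let each spatial position in a convolutional feature map attend to every other position, and to combine that attention output with the usual convolutional features. A minimal sketch of the attention branch (plain scaled dot-product self-attention over pixels; the paper's relative position encodings and multi-head structure are omitted here, and the weight matrices are toy placeholders):

```python
import numpy as np

def spatial_self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over the spatial positions of a
    feature map x of shape (height, width, channels). Each pixel is treated
    as a token; the output can be concatenated with convolutional features,
    as in attention augmentation (relative position terms omitted)."""
    h, w, c = x.shape
    flat = x.reshape(h * w, c)                     # one token per pixel
    q, k, v = flat @ wq, flat @ wk, flat @ wv      # queries, keys, values
    scores = q @ k.T / np.sqrt(q.shape[-1])        # (hw, hw) attention logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over positions
    return (weights @ v).reshape(h, w, -1)

# Toy example: a 4x4 feature map with 8 channels and random projections.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 4, 8))
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = spatial_self_attention(x, wq, wk, wv)       # shape (4, 4, 8)
```

Unlike a convolution, whose receptive field is local, every output position here is a weighted mixture over the entire feature map, which is the property the augmentation adds to standard CNNs.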
Neural Logic Machines
The research team proposes the Neural Logic Machine (NLM), a neural-symbolic architecture for both inductive learning and logic reasoning. NLMs combine the power of neural networks as function approximators with that of logic programming as a symbolic processor for objects with properties, relations, logic connectives, and quantifiers.
(Tsinghua University & Google & Bytedance)
You May Also Like
The Big Picture: Google Releases Massive Landmark Recognition Dataset
Google today announced the release of a new and improved landmark recognition dataset. Google-Landmarks-v2 includes over 5 million images, doubling the number in the landmark recognition dataset the tech giant released last year. The dataset now covers more than 200 thousand different landmarks, a sevenfold increase over the first version.
Buildup to Microsoft Build 2019
Microsoft Build 2019 is around the corner. From May 6 to 8, developers and software engineers will fill Seattle’s Washington State Convention Center, where Microsoft is expected to announce updates to Windows, Office 365, its Azure cloud computing platform, and other company platforms and services.
Global AI Events
May 6-8: Microsoft Build in Seattle, United States
May 7-9: Google I/O in Mountain View, United States
May 23-24: Deep Learning Summit in Boston, United States
June 4-7: Amazon re:MARS in Las Vegas, United States
Global AI Opportunities
Research Scientist, Google Brain Toronto
OpenAI is looking for software engineers and deep learning researchers
DeepMind Scholarship: Access to Science
Postdoctoral Researcher (AI) – Self-Supervised Learning