The White House on Monday joined a number of research groups to announce the release of the COVID-19 Open Research Dataset (CORD-19) of scholarly literature about COVID-19, SARS-CoV-2, and the broader coronavirus family.
Researchers proposed an automatic structured pruning framework, AutoCompress, which builds on an ADMM-based weight pruning algorithm introduced in 2018 and outperforms previous automatic model compression methods while maintaining high accuracy.
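The core subproblem in ADMM-based weight pruning is a Euclidean projection of the weights onto a sparsity constraint set, which amounts to keeping the largest-magnitude entries. The sketch below shows that projection step only, simplified to unstructured sparsity (AutoCompress itself targets structured patterns); the function name and toy matrix are illustrative assumptions, not the paper's code.

```python
import numpy as np

def sparsity_projection(W, k):
    """Project a weight matrix onto {at most k nonzeros}: the Z-update used
    in ADMM-based pruning. Keeps the k largest-magnitude entries, zeros the rest.
    (Sketch: unstructured sparsity; AutoCompress uses structured variants.)"""
    flat = np.abs(W).ravel()
    if k >= flat.size:
        return W.copy()
    thresh = np.partition(flat, -k)[-k]          # k-th largest magnitude
    return np.where(np.abs(W) >= thresh, W, 0.0)

# toy example: prune a random 10x10 layer down to 20 surviving weights
W = np.random.default_rng(1).normal(size=(10, 10))
Z = sparsity_projection(W, 20)
```

In the full ADMM loop this projection alternates with gradient steps on the training loss plus a quadratic penalty pulling the weights toward the projected copy.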
Proposed by researchers from Rutgers University and the Samsung AI Center in the UK, CookGAN uses an attention-based ingredients-image association model to condition a generative neural network tasked with synthesizing meal images.
A crowdsourcing effort produced 111.25 hours of video from 54 non-expert demonstrators to build “one of the largest, richest, and most diverse robot manipulation datasets ever collected using human creativity and dexterity.”
In a bid to raise awareness of the threats posed by climate change, the Mila team recently published a paper that uses GANs to generate images of how climate events may impact our environments — with a particular focus on floods.
Google teamed up with researchers from Synthesis AI and Columbia University to introduce a deep learning approach called ClearGrasp as a first step to teaching machines how to “see” transparent materials.
Researchers from Google Brain and Carnegie Mellon University have released models trained with a semi-supervised learning method called “Noisy Student” that achieve 88.4 percent top-1 accuracy on ImageNet.
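The Noisy Student loop is simple to state: a teacher pseudo-labels unlabeled data, a student is trained on labeled plus pseudo-labeled data with noise injected, and the student then becomes the next teacher. Below is a minimal numpy sketch of that loop with a toy nearest-centroid classifier standing in for the EfficientNet models; the synthetic data, noise scale, and classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_centroids(X, y):
    """Toy 'model': per-class mean vectors (stand-in for a real network)."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    """Assign each point to the nearest class centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None], axis=2)
    return d.argmin(axis=1)

# toy stand-ins for the labeled (ImageNet) and unlabeled (JFT) sets -- assumption
y_lab = rng.integers(0, 2, 100)
X_lab = rng.normal(size=(100, 5)); X_lab[:, 0] += 4 * y_lab
y_hidden = rng.integers(0, 2, 500)  # never shown to the model
X_unlab = rng.normal(size=(500, 5)); X_unlab[:, 0] += 4 * y_hidden

teacher = fit_centroids(X_lab, y_lab)
for _ in range(3):
    pseudo = predict(teacher, X_unlab)                              # teacher pseudo-labels
    X_noisy = X_unlab + rng.normal(scale=0.5, size=X_unlab.shape)   # noise on student inputs
    X_all = np.vstack([X_lab, X_noisy])
    y_all = np.concatenate([y_lab, pseudo])
    student = fit_centroids(X_all, y_all)                           # train noised student
    teacher = student                                               # student -> next teacher
```

In the paper the student is made larger than the teacher and the noise includes data augmentation, dropout, and stochastic depth; this sketch keeps only the iterate-with-noise structure.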
The Godfathers of AI and 2018 ACM Turing Award winners Geoffrey Hinton, Yann LeCun, and Yoshua Bengio shared a stage in New York on Sunday night at an event organized by the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020).
Batchboost is a simple technique to accelerate ML model training by adaptively feeding mini-batches with artificial samples created by mixing pairs of examples from the previous step, with a preference for pairing the examples that proved most difficult.
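The mixing step can be sketched as a mixup-style blend: carry over the previous batch's highest-loss samples and combine each with a fresh example. This is a simplified illustration of the batchboost idea, not the paper's exact pipeline; the function name, Beta mixing coefficient, and scalar labels are assumptions (classification would mix one-hot label vectors instead).

```python
import numpy as np

def mix_batches(x_prev, y_prev, losses, x_new, y_new, alpha=0.4, rng=None):
    """Blend the hardest samples from the previous step with new ones
    (sketch of the batchboost idea; labels here are scalars for simplicity)."""
    rng = rng or np.random.default_rng()
    k = len(x_new)
    hardest = np.argsort(losses)[-k:]            # highest-loss samples from previous step
    lam = rng.beta(alpha, alpha, size=(k, 1))    # mixup-style mixing coefficients
    x_mix = lam * x_prev[hardest] + (1 - lam) * x_new
    y_mix = lam[:, 0] * y_prev[hardest] + (1 - lam[:, 0]) * y_new
    return x_mix, y_mix
```

Each artificial sample is thus a convex combination of a hard old example and a new one, so difficult examples keep reappearing (in diluted form) until the model handles them.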