The Seventh International Conference on Learning Representations (ICLR) kicked off today. One of the world’s major machine learning conferences, ICLR this year received 1591 main conference paper submissions — up 60 percent over last year — and accepted 24 for oral presentations and 476 as poster presentations.
A group of Google researchers led by Quoc Le — the AI expert behind Google Neural Machine Translation and AutoML — has published a paper proposing attention augmentation for image classification. In experiments, the novel two-dimensional relative self-attention mechanism delivered “consistent improvements in image classification.”
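At its core, the idea is to concatenate a standard convolution’s feature maps with multi-head self-attention feature maps computed over image positions. Below is a minimal PyTorch sketch of that concatenation, assuming illustrative module and argument names (the paper’s relative position embeddings are omitted for brevity, so this is a simplified variant, not the authors’ implementation):

```python
import torch
import torch.nn as nn

class AttentionAugmentedConv(nn.Module):
    """Concatenate a convolution's output with multi-head self-attention
    feature maps over spatial positions. (Hypothetical simplified sketch;
    the paper's relative position embeddings are omitted.)"""

    def __init__(self, in_ch, out_ch, attn_ch, heads=4, kernel_size=3):
        super().__init__()
        assert attn_ch % heads == 0
        self.heads = heads
        # convolutional branch supplies the remaining output channels
        self.conv = nn.Conv2d(in_ch, out_ch - attn_ch, kernel_size,
                              padding=kernel_size // 2)
        # one 1x1 conv produces queries, keys and values for all heads
        self.qkv = nn.Conv2d(in_ch, 3 * attn_ch, 1)
        self.proj = nn.Conv2d(attn_ch, attn_ch, 1)

    def forward(self, x):
        b, _, h, w = x.shape
        conv_out = self.conv(x)
        q, k, v = self.qkv(x).chunk(3, dim=1)

        def split(t):  # (b, attn_ch, h, w) -> (b, heads, h*w, depth)
            return t.reshape(b, self.heads, -1, h * w).transpose(2, 3)

        q, k, v = split(q), split(k), split(v)
        # scaled dot-product attention across all h*w positions
        attn = torch.softmax(q @ k.transpose(-2, -1) * q.shape[-1] ** -0.5,
                             dim=-1)
        out = (attn @ v).transpose(2, 3).reshape(b, -1, h, w)
        return torch.cat([conv_out, self.proj(out)], dim=1)

# usage: a 32-channel map in, a 64-channel map out, 16 channels attentional
x = torch.randn(2, 32, 16, 16)
y = AttentionAugmentedConv(32, 64, 16)(x)  # shape (2, 64, 16, 16)
```

Because the attention branch attends over every spatial position, the augmented layer can capture long-range dependencies that a fixed-size convolutional kernel cannot.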
In 2016, Google’s DeepMind stunned the world when its Go-playing AI AlphaGo secured a historic victory over South Korean master Lee Sedol. Yesterday the UK’s top AI lab delivered its latest “wow moment,” as its AlphaFold system topped the Critical Assessment of protein Structure Prediction (CASP) competition.
New research from Carnegie Mellon University, Peking University and the Massachusetts Institute of Technology shows that gradient descent can reach global minima of deep neural networks under certain conditions. The paper Gradient Descent Finds Global Minima of Deep Neural Networks was published November 12 on arXiv.
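Convergence guarantees of this kind typically take the following schematic form (an illustrative restatement with constants and exact width requirements elided, not the paper’s precise theorem): if the network is sufficiently over-parameterized, then gradient descent with a small enough step size $\eta$ drives the training loss to zero at a linear rate,

\[
\|y - u(k)\|_2^2 \;\le\; \Big(1 - \frac{\eta \lambda_0}{2}\Big)^{k} \, \|y - u(0)\|_2^2 ,
\]

where $u(k)$ stacks the network’s predictions on the training set after $k$ gradient steps, $y$ stacks the labels, and $\lambda_0 > 0$ is the least eigenvalue of a Gram matrix determined by the data and the architecture.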
On July 10th, with German Chancellor Angela Merkel and Chinese Premier Li Keqiang looking on, Siemens AG signed a partnership agreement with Alibaba Cloud — the cloud-computing arm of Internet conglomerate Alibaba — to bring an Industrial Internet of Things (IIoT) upgrade to China’s manufacturing industry.
gcForest, a decision tree ensemble approach that is much easier to train than deep neural networks, has received considerable attention from researchers since it was introduced by Prof. Zhi-Hua Zhou and his student Ji Feng last year. Building on that work, Zhou, Feng and Nanjing University colleague Yang Yu have now proposed Multi-layered Gradient Boosting Decision Trees (mGBDTs).
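For readers unfamiliar with the layered-ensemble idea behind this line of work, the sketch below stacks gradient-boosted tree layers, feeding each layer’s class probabilities (alongside the raw features) into the next. It is a simplified cascade in the spirit of gcForest built with scikit-learn, with illustrative layer counts and seeds; it is not the authors’ mGBDT algorithm, which trains intermediate layers via target propagation with learned pseudo-inverse mappings:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy data standing in for a real classification task.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

feats_tr, feats_te = X_tr, X_te
proba_te = None
for layer in range(3):  # three cascade layers
    # two GBDT ensembles per layer, differing only in random seed
    models = [GradientBoostingClassifier(random_state=s).fit(feats_tr, y_tr)
              for s in (0, 1)]
    proba_tr = [m.predict_proba(feats_tr) for m in models]
    proba_te = [m.predict_proba(feats_te) for m in models]
    # the next layer sees the raw features plus this layer's probabilities
    feats_tr = np.hstack([X_tr] + proba_tr)
    feats_te = np.hstack([X_te] + proba_te)

# average the final layer's probability outputs to classify
pred = np.mean(proba_te, axis=0).argmax(axis=1)
print("cascade test accuracy:", (pred == y_te).mean())
```

The appeal of such cascades is that each layer is an ordinary tree ensemble with few hyperparameters, so the whole stack can be grown layer by layer without backpropagation.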