DeepMind trained and tested its neural models by first building a dataset of different types of mathematics problems. Rather than crowd-sourcing the questions, the team synthesized the dataset, which let it generate a larger number of training examples, control the difficulty level, and reduce training time.
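To make the synthesis approach concrete, here is a minimal Python sketch of one way to generate arithmetic question/answer pairs with a tunable difficulty knob. This is an illustrative toy generator, not DeepMind's actual pipeline; the `difficulty` parameter and its scaling rules are assumptions.

```python
import random

def make_arithmetic_problem(difficulty):
    """Generate one question/answer pair. Higher difficulty means
    more operands and larger numbers (an illustrative knob, not
    DeepMind's actual scheme)."""
    n_terms = 2 + difficulty          # more terms as difficulty rises
    max_val = 10 ** (1 + difficulty)  # larger operands as difficulty rises
    terms = [random.randint(1, max_val) for _ in range(n_terms)]
    ops = [random.choice(["+", "-", "*"]) for _ in range(n_terms - 1)]

    expr = str(terms[0])
    for op, term in zip(ops, terms[1:]):
        expr += f" {op} {term}"

    question = f"What is {expr}?"
    answer = str(eval(expr))  # safe here: expression built only from known tokens
    return question, answer

if __name__ == "__main__":
    random.seed(0)  # fixed seed so the synthetic dataset is reproducible
    for level in range(3):
        q, a = make_arithmetic_problem(level)
        print(f"difficulty={level}: {q} -> {a}")
```

Because every example is produced by a program rather than by annotators, scaling the dataset or shifting its difficulty distribution is just a matter of changing parameters.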
To boost learning research aimed at endowing robots with better generalization capabilities, Yi Wu from UC Berkeley and Yuxin Wu, Georgia Gkioxari, and Yuandong Tian from Facebook AI Research recently published the paper "Building Generalizable Agents with a Realistic and Rich 3D Environment."
Compared with statistical machine translation (SMT), neural machine translation (NMT) can train multiple features jointly and does not need prior domain knowledge, which enables zero-shot translation. Beyond higher BLEU scores and better sentence structure, NMT also reduces the morphology, syntax, and word-order errors common in SMT output.
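BLEU, the score mentioned above, measures n-gram overlap between a system translation and one or more reference translations. The sketch below uses NLTK's BLEU implementation on toy sentences; the "NMT-like" and "SMT-like" outputs are invented to mimic typical error patterns, not real system output.

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# One tokenized reference translation and two invented candidate outputs.
reference = ["the", "cat", "sits", "on", "the", "mat"]
nmt_output = ["the", "cat", "sits", "on", "the", "mat"]
smt_output = ["the", "cat", "sit", "on", "mat", "the"]  # morphology and word-order errors

# Smoothing avoids zero scores when higher-order n-grams have no matches.
smooth = SmoothingFunction().method1
for name, hypothesis in [("NMT-like", nmt_output), ("SMT-like", smt_output)]:
    score = sentence_bleu([reference], hypothesis, smoothing_function=smooth)
    print(f"{name} BLEU: {score:.3f}")
```

The exact translation scores perfectly while the garbled one is penalized for its broken agreement and shuffled word order, which is the behavior the BLEU comparison between NMT and SMT relies on.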
We explore top-notch Swiss AI facilities: from deep learning and neural network research at IDSIA in Lugano, through interdisciplinary research at École Polytechnique Fédérale de Lausanne and the University of Basel, to robotics innovations at ETH Zurich and the University of Zurich.