Tag: DeepMind

AI Technology

Applying Linearly Scalable Transformers to Model Longer Protein Sequences

Researchers have proposed a new transformer architecture called "Performer," built on what they call fast attention via orthogonal random features (FAVOR), which scales linearly with sequence length.
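The core idea behind FAVOR-style attention is to replace the full softmax attention matrix with a random-feature approximation so that attention can be computed in linear rather than quadratic time. The sketch below is a simplified illustration in NumPy: it uses positive random features but omits the orthogonalization step of the real FAVOR mechanism, and all dimensions, function names, and the toy "protein sequence" input are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def random_feature_map(x, proj):
    # Positive random features approximating the softmax kernel
    # (simplified; real FAVOR additionally orthogonalizes the projections).
    # x: (seq_len, d), proj: (d, num_features)
    d = x.shape[-1]
    x = x / d ** 0.25  # scaling from scaled dot-product attention
    # exp(w^T x - ||x||^2 / 2) keeps features positive; the constant
    # normalization factor cancels between numerator and denominator below.
    return np.exp(x @ proj - np.sum(x ** 2, axis=-1, keepdims=True) / 2)

def linear_attention(q, k, v, num_features=64, seed=0):
    """Approximate softmax attention in O(seq_len) time and memory."""
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((q.shape[-1], num_features))
    q_p = random_feature_map(q, proj)          # (seq_len, m)
    k_p = random_feature_map(k, proj)          # (seq_len, m)
    # Associativity lets us form k_p^T v first, so the seq_len x seq_len
    # attention matrix is never materialized.
    kv = k_p.T @ v                             # (m, d_v)
    normalizer = q_p @ k_p.sum(axis=0)         # (seq_len,)
    return (q_p @ kv) / normalizer[:, None]

# Usage: a toy 1,000-position sequence with 16-dimensional embeddings.
q = k = v = np.random.randn(1000, 16)
out = linear_attention(q, k, v)
print(out.shape)  # (1000, 16)
```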

AI Technology

DeepMind AI Flunks High School Math Test

DeepMind trained and tested its neural model by first collecting a dataset consisting of different types of mathematics problems. Rather than crowd-sourcing the problems, they synthesized the dataset, which let them generate a larger number of training examples, control the difficulty level, and reduce training time.
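To illustrate what procedurally generating question/answer pairs with a tunable difficulty level can look like, here is a minimal sketch. It is not DeepMind's generator: it covers only toy arithmetic, and the difficulty scheme (more operands and larger numbers as difficulty rises) and all names are assumptions for illustration.

```python
import random

def make_arithmetic_problem(difficulty, rng):
    """Generate one question/answer pair; higher difficulty means
    more operands and larger numbers (illustrative scheme only)."""
    num_terms = 2 + difficulty
    max_value = 10 ** (1 + difficulty)
    terms = [rng.randint(1, max_value) for _ in range(num_terms)]
    ops = [rng.choice(["+", "-"]) for _ in range(num_terms - 1)]
    expression = str(terms[0])
    for op, term in zip(ops, terms[1:]):
        expression += f" {op} {term}"
    # Safe to eval: the expression was built entirely from integers and +/- above.
    return {"question": f"What is {expression}?", "answer": str(eval(expression))}

# Build a small synthetic training set spanning three difficulty levels.
rng = random.Random(0)
dataset = [make_arithmetic_problem(d, rng) for d in range(3) for _ in range(1000)]
print(dataset[0])
```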