Category: Natural Language Tech


Google Brain’s Switch Transformer Language Model Packs 1.6-Trillion Parameters

Google Brain’s Switch Transformer language model packs a whopping 1.6 trillion parameters while keeping computational cost under control: its sparse Mixture-of-Experts design routes each token to a single expert, so only a small fraction of the parameters is active for any given input. The model achieved a 4x pretraining speedup over a strongly tuned T5-XXL baseline.
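The routing idea can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, dimensions, and the simple dense experts below are illustrative assumptions, showing only the top-1 ("switch") gating that sends each token to exactly one expert:

```python
import numpy as np

def switch_route(tokens, gate_w, experts):
    """Top-1 (Switch) routing sketch: each token is dispatched to exactly
    one expert, so per-token compute stays roughly constant regardless of
    how many experts (and hence parameters) the layer holds."""
    logits = tokens @ gate_w                        # (n_tokens, n_experts)
    # softmax over experts to get gate probabilities
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)
    chosen = probs.argmax(-1)                       # one expert index per token
    out = np.zeros_like(tokens)
    for e, expert in enumerate(experts):
        mask = chosen == e
        if mask.any():
            # scale each expert's output by its gate probability,
            # as in the Switch Transformer's gating
            out[mask] = expert(tokens[mask]) * probs[mask, e][:, None]
    return out

rng = np.random.default_rng(0)
d, n_experts = 8, 4
tokens = rng.normal(size=(16, d))
gate_w = rng.normal(size=(d, n_experts))
# hypothetical experts: simple linear layers standing in for FFN blocks
experts = [(lambda W: (lambda x: x @ W))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
y = switch_route(tokens, gate_w, experts)
print(y.shape)  # (16, 8)
```

Because only one expert runs per token, adding experts grows the parameter count without growing the FLOPs per token, which is how the model scales to the trillion-parameter range at a controlled cost.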