Language Models Redefined: Transforming Textual Mastery into Compression Brilliance
In the new paper Language Modeling Is Compression, a collaborative team from Google DeepMind, Meta AI, and Inria investigates the lossless compression capabilities of foundation models, showing that they achieve state-of-the-art compression rates across various data types.
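The intuition behind the paper's thesis is that a good predictive model doubles as a compressor: a symbol predicted with probability p can, in principle, be encoded in about -log2 p bits (this is what an arithmetic coder approaches in practice). The sketch below illustrates that link with a toy character-level bigram model standing in for a foundation model; the function names (compressed_length_bits, make_bigram_model) and the add-one smoothing are illustrative assumptions, not the paper's implementation.

```python
import math
from collections import Counter, defaultdict

def compressed_length_bits(text, model_prob):
    # Ideal code length in bits if every symbol is encoded with
    # -log2 p(symbol | context), as an arithmetic coder would
    # (ignoring a small constant overhead).
    total = 0.0
    for i, ch in enumerate(text):
        p = model_prob(text[:i], ch)
        total += -math.log2(p)
    return total

def make_bigram_model(corpus, alphabet):
    # Toy character-level bigram model with add-one smoothing;
    # a stand-in for a large language model's next-token distribution.
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1
    vocab_size = len(alphabet)

    def prob(context, ch):
        prev = context[-1] if context else ""
        c = counts.get(prev, Counter())
        return (c[ch] + 1) / (sum(c.values()) + vocab_size)

    return prob

if __name__ == "__main__":
    corpus = "the theory that language modeling is compression " * 20
    alphabet = sorted(set(corpus))
    model = make_bigram_model(corpus, alphabet)

    text = "language modeling is compression"
    bits = compressed_length_bits(text, model)
    print(f"raw size: {len(text) * 8} bits")
    print(f"model-based ideal code length: {bits:.1f} bits "
          f"({bits / (len(text) * 8):.1%} of raw)")
```

The better the model predicts the data, the shorter the resulting code; swapping the toy bigram model for a strong foundation model is, in essence, the substitution the paper studies.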