Google Extends Transformers for Immediate Knowledge Acquisition via a Simple New Data Read & Memorize Technique
A Google research team tackles the resource-heavy training and fine-tuning that conventional transformers require to learn new knowledge, proposing Memorizing Transformers as a step toward language models that can simply read and memorize new data at inference time for immediate knowledge acquisition.
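The core idea the paper builds on is an external, non-differentiable memory of cached (key, value) pairs: as the model reads text, it writes keys and values into this memory, and at inference time each query also attends over its approximate nearest neighbors retrieved from it, so new content is "memorized" without any gradient updates. The sketch below is a minimal illustration of that mechanism, not the paper's implementation; names such as `KNNMemory`, `d_model`, `top_k`, and the fixed `gate` value are assumptions for the example (the actual model uses approximate kNN search at scale and a learned per-head gate).

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class KNNMemory:
    """External memory of cached (key, value) pairs, written to as text is read."""
    def __init__(self, d_model):
        self.keys = np.empty((0, d_model))
        self.values = np.empty((0, d_model))

    def write(self, keys, values):
        # Store keys/values from the current context; no gradients are involved.
        self.keys = np.vstack([self.keys, keys])
        self.values = np.vstack([self.values, values])

    def lookup(self, queries, top_k=4):
        # Exact dot-product kNN for clarity (the paper uses approximate search at scale).
        scores = queries @ self.keys.T                    # (n_q, n_mem)
        idx = np.argsort(-scores, axis=-1)[:, :top_k]     # top-k neighbors per query
        return self.keys[idx], self.values[idx]           # (n_q, k, d)

def memory_augmented_attention(q, local_k, local_v, memory, top_k=4, gate=0.5):
    """Mix standard attention over the local context with attention over
    retrieved memories (the gate is learned per head in the actual model)."""
    d = q.shape[-1]
    # Local attention over the current context window.
    local_out = softmax(q @ local_k.T / np.sqrt(d)) @ local_v
    # Attention over the top-k (key, value) pairs retrieved from memory.
    mem_k, mem_v = memory.lookup(q, top_k)                # (n_q, k, d)
    mem_scores = np.einsum("qd,qkd->qk", q, mem_k) / np.sqrt(d)
    mem_out = np.einsum("qk,qkd->qd", softmax(mem_scores), mem_v)
    return gate * mem_out + (1.0 - gate) * local_out

# Usage: "read" a document by writing its keys/values to memory, then answer
# queries whose attention can reach back into that memorized content.
d_model = 16
memory = KNNMemory(d_model)
doc_k, doc_v = np.random.randn(128, d_model), np.random.randn(128, d_model)
memory.write(doc_k, doc_v)                                # memorize at inference time
q = np.random.randn(4, d_model)
local_k, local_v = np.random.randn(32, d_model), np.random.randn(32, d_model)
out = memory_augmented_attention(q, local_k, local_v, memory)
print(out.shape)  # (4, 16)
```

Because the memory is only written to and read from, never backpropagated through, extending what the model "knows" at inference time is as cheap as appending new (key, value) pairs, which is what lets the approach sidestep retraining or fine-tuning.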