Microsoft’s Parameter-Efficient Z-Code++ Language Model Beats the 200x Larger GPT3-175B on Abstractive Text Summarization
In the new paper Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization, a research team from Microsoft Azure AI and Microsoft Research introduces Z-Code++, a novel encoder-decoder pretrained language model that is optimized for abstractive text summarization and delivers significant performance gains on low-resource summarization tasks.