Tag: Natural Language Processing

AI Natural Language Tech Research

Yann LeCun Team Uses Dictionary Learning To Peek Into Transformers’ Black Boxes

A Yann LeCun team proposes using dictionary learning to provide detailed visualizations of transformer representations and insights into the semantic structures captured by transformers, such as word-level disambiguation, sentence-level pattern formation, and long-range dependencies.
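As a rough illustration of the idea (not the authors' code), one can collect hidden states from a pretrained transformer and fit a sparse dictionary over them, then inspect which tokens and contexts activate a given dictionary atom. The model name, layer index, and dictionary size below are illustrative assumptions.

```python
# Minimal sketch: sparse dictionary learning over transformer hidden states,
# then ranking contexts by how strongly they activate one dictionary atom.
import numpy as np
import torch
from sklearn.decomposition import MiniBatchDictionaryLearning
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

sentences = [
    "The bank raised interest rates.",
    "They sat on the river bank.",
    "She deposited cash at the bank.",
]

states, contexts = [], []
with torch.no_grad():
    for s in sentences:
        enc = tok(s, return_tensors="pt")
        hidden = model(**enc).hidden_states[8][0]   # layer 8 chosen arbitrarily
        toks = tok.convert_ids_to_tokens(enc["input_ids"][0])
        for t, h in zip(toks, hidden):
            states.append(h.numpy())
            contexts.append((t, s))

X = np.stack(states)
# Learn a small, sparse dictionary of "factors" over the hidden states
# (dictionary size and sparsity penalty are illustrative).
dl = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
codes = dl.fit_transform(X)

# Inspect one factor: which tokens/contexts activate it most strongly?
factor = np.abs(codes[:, 0])
for i in factor.argsort()[::-1][:5]:
    print(f"{factor[i]:.3f}  {contexts[i][0]:<10} | {contexts[i][1]}")
```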

AI Machine Learning & Data Science Research | Share My Research

UmlsBERT: Clinical Domain Knowledge Augmentation of Contextual Embeddings Using the Unified Medical Language System Metathesaurus

UmlsBERT is a deep Transformer network architecture that incorporates clinical domain knowledge from the Unified Medical Language System (UMLS) Metathesaurus to build "semantically enriched" contextual representations that benefit from both contextual learning and domain knowledge.
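A minimal sketch of the general idea (assumptions, not the released UmlsBERT code): add a learned UMLS semantic-type embedding to BERT's word embeddings for tokens that map to a Metathesaurus concept, and feed the enriched embeddings through the encoder. The toy semantic-type lookup, the number of types, and the embedding-sum fusion are illustrative assumptions.

```python
# Sketch: augmenting BERT input embeddings with UMLS semantic-type embeddings.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

NUM_SEMANTIC_TYPES = 45   # e.g. one id per UMLS semantic group (assumed)
PAD_TYPE_ID = 0           # id 0 = "no concept" for ordinary tokens

class UmlsAugmentedEncoder(nn.Module):
    def __init__(self, base="bert-base-uncased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(base)
        hidden = self.bert.config.hidden_size
        # Extra embedding table for UMLS semantic types, summed with the
        # usual word embeddings before encoding.
        self.sem_type_emb = nn.Embedding(NUM_SEMANTIC_TYPES, hidden,
                                         padding_idx=PAD_TYPE_ID)

    def forward(self, input_ids, attention_mask, sem_type_ids):
        word_emb = self.bert.embeddings.word_embeddings(input_ids)
        enriched = word_emb + self.sem_type_emb(sem_type_ids)
        # Pass the enriched embeddings in place of input_ids; position and
        # token-type embeddings are still added inside the model.
        out = self.bert(inputs_embeds=enriched, attention_mask=attention_mask)
        return out.last_hidden_state

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
enc = tok("Patient denies chest pain.", return_tensors="pt")
# Toy semantic-type ids: tag "pain" with a hypothetical 'Sign or Symptom' id.
sem_ids = torch.zeros_like(enc["input_ids"])
sem_ids[0, -3] = 7

model = UmlsAugmentedEncoder()
reps = model(enc["input_ids"], enc["attention_mask"], sem_ids)
print(reps.shape)  # (1, seq_len, 768)
```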