Israeli AI research company AI21 Labs today published the paper "SenseBERT: Driving Some Sense into BERT," which proposes a new model that significantly improves lexical disambiguation and achieves state-of-the-art results on the challenging Word-in-Context (WiC) language task.