Blog - Readings - posts for November 2019
Nov 5, 2019
Is BERT a Game Changer in NLP?
BERT (Bidirectional Encoder Representations from Transformers) is an open-source NLP pre-training model developed by researchers at Google in 2018. It has inspired multiple NLP architectures, training approaches, and language models, including Google’s Transformer-XL, OpenAI’s GPT-2, ERNIE 2.0, XLNet, and RoBERTa.
For instance, Google Search now uses BERT to return more relevant results, and Bharat S Raj expects it to power smarter chatbots in conversational AI applications.