Readings

Is BERT a Game Changer in NLP?


BERT (Bidirectional Encoder Representations from Transformers) is an open-source NLP pre-training model released by researchers at Google in 2018. It has inspired multiple NLP architectures, training approaches, and language models, including Google's Transformer-XL, OpenAI's GPT-2, ERNIE 2.0, XLNet, and RoBERTa.

For instance, BERT is now used by Google Search to provide more relevant results. According to Bharat S Raj, it can also power smarter chatbots in conversational AI applications.


Site maintained by OW2