Transformers – Attention is all you need

Transformers are becoming increasingly important, not just in NLP: they are now extending into other areas of deep learning beyond language. Google has rolled out BERT and transformer-based models to Google Search, using them to power search results, and they call it one […]

Convolutional Sequence to Sequence Learning

Traditionally, recurrent neural networks (RNNs) with LSTM or GRU units have been the most prevalent tools for NLP researchers, providing state-of-the-art results on many NLP tasks, including language modeling (LM), neural machine translation (NMT), sentiment analysis, and so on. However, a major drawback of RNNs is that since each word in the input sequence […]

Sequential Data Processing in NLP

We humans have an amazing ability to rapidly interpret words and put them into context as we exchange thoughts and interact with others, and the credit goes to the best computer we have ever known: the human brain. Over the years, scientists have carried out extensive research and found that it involves a huge […]