Positional Embeddings

Aug 15, 2020 by Rutvik

The Transformer architecture was proposed as a purely attention-based sequence-to-sequence model. Because it processes text in parallel and sidesteps the long-range dependency problems of RNNs, it has become very popular among researchers seeking performance improvements. This article discusses a specific part of the architecture: positional embeddings.
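As a taste of what the article covers, here is a minimal sketch of the fixed sinusoidal positional embeddings from the original Transformer paper ("Attention Is All You Need", Vaswani et al., 2017); the article itself goes into more depth, and the function name and NumPy implementation here are illustrative choices, not the article's exact code.

```python
import numpy as np

def sinusoidal_positional_embeddings(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of fixed positional embeddings.

    PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]    # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]   # shape (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions get cosine
    return pe

pe = sinusoidal_positional_embeddings(seq_len=50, d_model=512)
print(pe.shape)  # (50, 512) -- one embedding vector per position
```

Each position gets a unique vector, and because the encoding is a deterministic function of the position, it extends to sequence lengths not seen during training.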


Introduction to Tokenizers - 1

Aug 08, 2020 by Rutvik

Natural Language Processing, or NLP for short, is the automated manipulation or processing of language. This article walks you through an important concept in NLP, tokenization, and the different methods and types of tokenizers in use.
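To make the idea concrete before you read on, here is a minimal sketch contrasting two simple tokenization strategies; the example sentence and the regex pattern are illustrative assumptions, not the article's exact method.

```python
import re

sentence = "Tokenization isn't as simple as it looks!"

# Whitespace tokenization: split on spaces only,
# so punctuation stays attached to words.
whitespace_tokens = sentence.split()
print(whitespace_tokens)
# ['Tokenization', "isn't", 'as', 'simple', 'as', 'it', 'looks!']

# A regex-based word tokenizer: runs of word characters become tokens,
# and each punctuation mark becomes its own token.
word_tokens = re.findall(r"\w+|[^\w\s]", sentence)
print(word_tokens)
# ['Tokenization', 'isn', "'", 't', 'as', 'simple', 'as', 'it', 'looks', '!']
```

Even this tiny example shows why tokenizer choice matters: the two strategies disagree on contractions and punctuation, which changes the vocabulary a downstream model sees.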


SO WHAT DO YOU THINK?

Did you find these articles interesting?
Please share your thoughts with me.

Contact Me