All You Need to Know About Positional Encodings in the Transformer


In RNNs and LSTMs, words are fed in one at a time, so the model naturally picks up the order of the words. The price of this recurrence is that the number of sequential operations grows with the length of the sentence. A transformer, in contrast, processes all the words in parallel, which greatly reduces training time, but it also discards the order of the words; positional encodings are how we put that order back in.
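As a concrete illustration of how order can be injected back in, here is a minimal NumPy sketch of the sinusoidal positional encodings introduced in the original Transformer, where even dimensions get a sine and odd dimensions a cosine of a position-dependent angle. The function name and the example shapes are my own choices for illustration, not from this post:

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encodings (d_model assumed even).

    Returns an array of shape (max_len, d_model) whose row `pos`
    encodes position `pos`: even columns use sin, odd columns cos.
    """
    pos = np.arange(max_len)[:, None]        # (max_len, 1) positions
    i = np.arange(0, d_model, 2)[None, :]    # (1, d_model // 2) dim indices
    angle = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle)              # even dimensions
    pe[:, 1::2] = np.cos(angle)              # odd dimensions
    return pe

pe = positional_encoding(max_len=50, d_model=16)
print(pe.shape)  # (50, 16)
```

In practice this matrix is simply added to the word embeddings, so each token carries both its meaning and its position.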

This blog post will get into the nitty-gritty details of the attention mechanism and build one from scratch in Python.
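Since the post promises an attention mechanism from scratch in Python, here is a hedged NumPy sketch of scaled dot-product attention, the building block the Transformer uses; the variable names and the toy inputs are illustrative assumptions, not taken from this post:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: two tokens attending over each other.
Q = K = V = np.array([[1.0, 0.0], [0.0, 1.0]])
out, w = scaled_dot_product_attention(Q, K, V)
print(w.sum(axis=-1))  # each row of attention weights sums to 1
```

Each row of `w` is a probability distribution over the keys, and the output is the corresponding weighted average of the values.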


Before beginning this blog post, I highly recommend visiting my earlier blog post, which gives an overview of transformers; it will help you get the most out of this one.


Ashis Kumar Panda

