Hey all!
This is Nikolas from AI Summer.
I am pretty interested in self-attention and transformers in computer vision, and I have started an open-source project that collects my process of re-implementing different self-attention and transformer modules from computer vision architectures.
If anyone is interested in the same stuff, please do let me know.
Here is a list of articles that I wrote:
- How Attention works in Deep Learning: understanding the attention mechanism in sequence models
- How Transformers work in deep learning and NLP: an intuitive introduction
- How the Vision Transformer (ViT) works
- Understanding einsum for Deep learning: implement a transformer with multi-head self-attention from scratch
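Since the last article walks through implementing multi-head self-attention with einsum, here is a rough sketch of the idea in numpy. The function name, weight shapes, and variable names are my own illustration, not code from the repo or articles:

```python
import numpy as np

def multi_head_self_attention(x, w_qkv, w_out, heads):
    """Minimal multi-head self-attention sketch using einsum.

    x:     (tokens, dim) input sequence
    w_qkv: (dim, 3 * heads * head_dim) fused query/key/value projection
    w_out: (heads * head_dim, dim) output projection
    """
    tokens, dim = x.shape
    head_dim = w_qkv.shape[1] // (3 * heads)
    # One matmul projects to q, k, v; then split into (tokens, 3, heads, head_dim)
    qkv = np.einsum('td,dp->tp', x, w_qkv).reshape(tokens, 3, heads, head_dim)
    q, k, v = qkv[:, 0], qkv[:, 1], qkv[:, 2]  # each (tokens, heads, head_dim)
    # Scaled dot-product scores per head: (heads, tokens, tokens)
    scores = np.einsum('the,The->htT', q, k) / np.sqrt(head_dim)
    # Softmax over the key axis (numerically stabilized)
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)
    # Weighted sum of values, merge heads, project back to the model dim
    out = np.einsum('htT,The->the', attn, v).reshape(tokens, heads * head_dim)
    return out @ w_out
```

The nice part of the einsum notation is that the head axis never needs explicit transposes or loops; the subscripts make the batching over heads explicit.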
Based on all my research, going from zero to hero on self-attention, I also created this repo:
I would love to hear from more people who are interested in re-implementing papers in this direction.
Have a fab day!