Transformers
Know a great resource on Transformers? Send it to us at contactbackprop@gmail.com!
Tags: Neural Networks (MLPs), Layer Normalization, Residual Connections
Lectures
Transformers and Self-Attention
Ashish Vaswani and Anna Huang, Stanford University, Winter 2019
Videos on self-attention, the model itself, and a few famous Transformer models
Attention and Transformer Networks
By Pascal Poupart, Professor at the University of Waterloo
The Transformer for Language Understanding
A code-based lecture by Rachel Thomas of fast.ai
Videos
Explains transformers and compares them to RNNs and LSTMs
Walks through and explains the original paper
An Illustrated Guide to Transformers
A visual walkthrough of the Transformer model
Posts
A blog post explaining Transformers with visuals
A detailed walkthrough of attention mechanisms before explaining Transformers (a minimal attention sketch follows this list)
A blog post explaining Transformers step-by-step with PyTorch code
An explanation of modern transformers without some of the historical baggage
Explaining Transformers in Q&A format
A detailed walkthrough of the Transformer variants proposed after the original
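The posts above all build on the same core computation, scaled dot-product attention. As a quick reference alongside them, here is a minimal PyTorch sketch; the tensor shapes and the helper's name are illustrative assumptions, not taken from any one post:

    import math
    import torch

    def scaled_dot_product_attention(q, k, v, mask=None):
        # q, k, v: (batch, heads, seq_len, d_k) -- shapes assumed for illustration
        d_k = q.size(-1)
        # Scaling by sqrt(d_k) keeps the softmax in a well-behaved range
        scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
        if mask is not None:
            # Masked positions (padding or future tokens) get -inf,
            # so softmax assigns them zero weight
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)  # attention distribution
        return weights @ v                       # weighted sum of the values

    q = k = v = torch.randn(2, 4, 10, 16)  # batch=2, heads=4, seq_len=10, d_k=16
    print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 4, 10, 16])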
Code Examples
TensorFlow Transformer Implementation Example
A TensorFlow tutorial of the Transformer model for translating Portuguese text to English
Text Classification with Transformer
A Keras tutorial implementing a Transformer block (see the sketch after this list for a minimal PyTorch equivalent)
Sequence-to-Sequence Modeling with Transformers
A Transformer model tutorial in PyTorch
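For orientation before working through the tutorials above, a single Transformer block of the kind the Keras tutorial builds might look like this in PyTorch; the layer sizes and class name are assumptions for illustration, not code from any of the tutorials:

    import torch
    import torch.nn as nn

    class TransformerBlock(nn.Module):
        # Self-attention and a feed-forward net, each wrapped in a
        # residual connection plus LayerNorm, as in the original paper
        def __init__(self, d_model=64, n_heads=4, d_ff=128, dropout=0.1):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads,
                                              dropout=dropout, batch_first=True)
            self.ff = nn.Sequential(
                nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
            )
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)
            self.drop = nn.Dropout(dropout)

        def forward(self, x):
            # Residual connection around self-attention
            attn_out, _ = self.attn(x, x, x)
            x = self.norm1(x + self.drop(attn_out))
            # Residual connection around the position-wise feed-forward net
            x = self.norm2(x + self.drop(self.ff(x)))
            return x

    x = torch.randn(8, 20, 64)  # batch=8, seq_len=20, d_model=64
    print(TransformerBlock()(x).shape)  # torch.Size([8, 20, 64])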
APIs
PyTorch API for a Transformer model
A library by Google Brain that bundles several Transformer model APIs
An API for state-of-the-art natural language processing tasks in PyTorch and TensorFlow (a minimal usage sketch follows this list)
The paper accompanying the API
An API built on top of Hugging Face for state-of-the-art NLP models
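To give a sense of how little code these APIs ask for, here is a minimal sketch using the Hugging Face transformers pipeline; the task string is real, but the input text and printed output are illustrative assumptions:

    from transformers import pipeline

    # With no explicit model argument, the library downloads a default
    # checkpoint fine-tuned for the requested task
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers make sequence modeling much easier."))
    # Illustrative output: [{'label': 'POSITIVE', 'score': 0.999...}]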
Tagged Pages: ALBERT, BART, BERT, Electra, GPT-1, GPT-2, GPT-3, Image Transformer, Longformer, Reformer, RoBERTa, Sparse Transformer, Switch Transformer, Transformer-XL, Universal Transformer, Vision Transformer