Lecture 7 - Attention/Transformers in NLP
Teacher: Romain Bielawski (ANITI)
Contents
- Attention in LSTMs
- Self-attention and transformers
- Embeddings: BERT
- Generation: GPT-2
- Applications: grammatical correctness classification, text generation, movie plot generation
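The self-attention mechanism at the heart of the transformer topics above can be sketched as scaled dot-product attention, softmax(QKᵀ/√d_k)·V. This is a minimal NumPy illustration for intuition, not the lecture's or notebook's code:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V with a stable softmax."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the keys
    return weights @ V, weights

# Toy example: a sequence of 3 tokens with embedding dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
# Self-attention: queries, keys, and values all come from the same sequence.
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4): one contextualised vector per token
```

Each output row is a weighted average of the value vectors, with weights given by how strongly that token's query matches every key; in a real transformer, Q, K, and V are learned linear projections of X rather than X itself.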
Prerequisites:
Knowledge of neural network principles; familiarity with several NN layer types; gradient descent and backpropagation; NLP basics; RNN principles
Slides
Download the slides here
Notebook
Access the Colab notebook here