Lecture 8 - Attention/Transformers in NLP
Teacher: Romain Bielawski (ANITI)
Lecture video
View the recorded lecture here (this will only be available for approximately 6 weeks after the course)
Contents
- Attention in LSTMs
- Self-attention and the Transformer
- Embedding: BERT
- Generation: GPT-2
- Applications: grammatical-correctness classification, text generation, movie-plot generation
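As a warm-up for the self-attention part of the lecture, here is a minimal sketch of scaled dot-product self-attention in pure Python. The function names (`self_attention`, `softmax`) and the tiny toy matrices are illustrative choices, not code from the course notebook; the lecture itself uses real Transformer models (BERT, GPT-2).

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of d-dim row vectors.

    Each token's query is compared to every token's key; the resulting
    softmax weights mix the value vectors: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.
    """
    def matmul(A, B):
        # Plain list-of-lists matrix product.
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
                for row in A]

    Q, K, V = matmul(X, Wq), matmul(X, Wk), matmul(X, Wv)
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query with every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Output is a convex combination of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: 2 tokens, dimension 2, identity projections, so Q = K = V = X.
X = [[1.0, 0.0], [0.0, 1.0]]
I = [[1.0, 0.0], [0.0, 1.0]]
print(self_attention(X, I, I, I))  # each token attends mostly to itself
```

With identity projections each token's query matches its own key best, so the attention weights concentrate on the token itself; swapping in learned `Wq`, `Wk`, `Wv` matrices is what the Transformer layers in BERT and GPT-2 do, across many heads in parallel.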
Prerequisites:
Knowledge of neural network principles; familiarity with several NN layer types; gradient descent and backpropagation; basics of NLP; RNN principles
Slides
Download the slides here
Notebook
Access the Colab notebook here