Lecture 8 - Attention/Transformers in NLP

Teacher: Romain Bielawski (ANITI)

Lecture video

View the recorded lecture here (this will only be available for approximately 6 weeks after the course)

Contents

Prerequisites:

Knowledge of neural network principles; familiarity with several NN layer types; gradient descent & backpropagation; basics of NLP; RNN principles
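
Since the lecture centres on the attention mechanism, here is a minimal NumPy sketch of scaled dot-product attention as background; the function name, shapes, and toy data are illustrative and not taken from the lecture materials or notebook.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns the attended values (n_queries, d_v) and the attention weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

# Toy usage: 2 query tokens attending over 3 key/value tokens of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, attn = scaled_dot_product_attention(Q, K, V)
print(out.shape, attn.shape)  # (2, 4) (2, 3)
```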

Slides

Download the slides here

Notebook

Access the Colab notebook here

Further reading:


(Back to Main Page)