Lecture 7 - Attention/Transformers in NLP

Teacher: Romain Bielawski (ANITI)

Contents

Prerequisites:

Knowledge of neural network principles; familiarity with several NN layer types; gradient descent and backpropagation; basics of NLP; RNN principles
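As a preview of the lecture topic, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of Transformers. The function name, tensor shapes, and random inputs are illustrative only and are not taken from the lecture slides:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return the attention output and the attention weights.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v)
    """
    d_k = K.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 query vectors of dimension 4
K = rng.normal(size=(5, 4))   # 5 key vectors of dimension 4
V = rng.normal(size=(5, 2))   # 5 value vectors of dimension 2
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)     # (3, 2) (3, 5)
```

Each row of `w` sums to 1, so every output vector is a convex combination of the value vectors, weighted by query-key similarity.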

Slides

Download the slides here

Notebook

Access the Colab notebook here

Further reading:

