Lecture 6 - Recurrent Neural Networks for text processing
Teacher: Romain Bielawski (ANITI)
Contents
- Sequential data and variable size inputs
- Recurrent neural network principles, hidden states (see the RNN sketch after this list)
- Backpropagation through time (BPTT)
- RNN, LSTM, GRU
- Sentence embeddings, text generation, seq2seq, machine translation
- Application: LSTM for text generation (see the sampling sketch after this list)
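As a preview of the core idea, here is a minimal sketch of the vanilla RNN recurrence in PyTorch. The dimensions, weight names, and initialization are illustrative assumptions, not taken from the slides; it shows how one set of weights is reused at every time step, which is what lets an RNN consume variable-length input.

```python
import torch

# A single vanilla RNN step: h_t = tanh(W_x x_t + W_h h_{t-1} + b).
# Sizes below are illustrative assumptions, not from the lecture.
input_size, hidden_size = 8, 16
W_x = torch.randn(hidden_size, input_size) * 0.1
W_h = torch.randn(hidden_size, hidden_size) * 0.1
b = torch.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    return torch.tanh(W_x @ x_t + W_h @ h_prev + b)

# Unroll over a variable-length sequence: the same weights are applied
# at every position, so the sequence can be any length.
seq = [torch.randn(input_size) for _ in range(5)]
h = torch.zeros(hidden_size)
for x_t in seq:
    h = rnn_step(x_t, h)
print(h.shape)  # torch.Size([16]) -- the final hidden state
```

The final hidden state is a fixed-size vector whatever the sequence length, which is why it can serve as a simple sentence embedding.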
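And here is a similarly hedged sketch of the text-generation application: sample one token at a time from an LSTM and feed each prediction back as the next input. The vocabulary, layer sizes, and the `generate` helper are hypothetical, and the weights are untrained (so the output is gibberish), but the sampling loop is the same one used after training.

```python
import torch
import torch.nn as nn

# Hypothetical character vocabulary and model sizes (assumptions).
vocab = list("abcdefghijklmnopqrstuvwxyz ")
vocab_size, embed_dim, hidden_size = len(vocab), 32, 64

embed = nn.Embedding(vocab_size, embed_dim)
lstm = nn.LSTM(embed_dim, hidden_size, batch_first=True)
head = nn.Linear(hidden_size, vocab_size)

def generate(seed_idx, n_chars, temperature=1.0):
    """Sample characters one at a time, feeding each prediction back in."""
    idx = torch.tensor([[seed_idx]])  # shape (batch=1, seq=1)
    state = None                      # (h_0, c_0) default to zeros
    out = [seed_idx]
    for _ in range(n_chars):
        x = embed(idx)
        y, state = lstm(x, state)     # state carries h_t and the cell c_t
        logits = head(y[:, -1, :]) / temperature
        idx = torch.multinomial(torch.softmax(logits, dim=-1), 1)
        out.append(idx.item())
    return "".join(vocab[i] for i in out)

print(generate(seed_idx=0, n_chars=20))  # gibberish until the model is trained
```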
Slides
Download the slides here
Notebook
Access the Colab notebook here
Prerequisites:
Knowledge of neural network principles; familiarity with several NN layer types; gradient descent & backpropagation; basics of NLP