Summer Learning

Week 0: A nice introduction to neural attention, a second blog on attention mechanisms, and a third blog on attention in neural networks. A link to the STN Tutorial and a link to the Seq2seq with Attention Tutorial.
Week 0 suggested exercises: 1. Include an STN module in a CIFAR-10 classification model. 2. Try another language pair in the Seq2seq with attention model; here is a website with many bilingual sentence pairs. Also experiment with the hyperparameters and the structure of the model.
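Before wiring an STN into a classifier, it helps to see its core operation in isolation: building a sampling grid from an affine matrix and bilinearly sampling the input at those locations. A minimal NumPy sketch of that step (the function names and shapes here are my own, not from any of the linked tutorials; in PyTorch this is what `F.affine_grid` and `F.grid_sample` do):

```python
import numpy as np

def affine_grid(theta, H, W):
    """Build an (H, W, 2) sampling grid in normalized coords [-1, 1]
    from a 2x3 affine matrix theta, as in the STN paper."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, H), np.linspace(-1, 1, W),
                         indexing="ij")
    coords = np.stack([xs, ys, np.ones_like(xs)], axis=-1)  # (H, W, 3)
    return coords @ theta.T  # source (x, y) for every output pixel

def bilinear_sample(img, grid):
    """Sample a 2-D image at the grid locations with bilinear interpolation."""
    H, W = img.shape
    # map normalized coords back to pixel indices
    x = (grid[..., 0] + 1) * (W - 1) / 2
    y = (grid[..., 1] + 1) * (H - 1) / 2
    x0 = np.clip(np.floor(x).astype(int), 0, W - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, H - 2)
    wx, wy = x - x0, y - y0
    top = img[y0, x0] * (1 - wx) + img[y0, x0 + 1] * wx
    bot = img[y0 + 1, x0] * (1 - wx) + img[y0 + 1, x0 + 1] * wx
    return top * (1 - wy) + bot * wy

# sanity check: the identity transform should reproduce the input
theta = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
img = np.arange(16.0).reshape(4, 4)
out = bilinear_sample(img, affine_grid(theta, 4, 4))
```

In the full STN, `theta` is not fixed: a small localization network predicts it from the input, and the whole pipeline stays differentiable because bilinear sampling is differentiable in both the image and the grid.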

Week 1: The paper introducing the Transformer, a seq2seq model based on attention only. Annotated PyTorch code. The tensor2tensor repository, which contains a TensorFlow implementation.
Week 1 suggested exercise: understand the TensorFlow implementation in T2T.
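The building block to internalize before reading either implementation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A self-contained NumPy sketch (single head, no masking, for clarity only):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    the core operation of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity logits
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))   # 2 queries of dimension d_k = 4
K = rng.normal(size=(3, 4))   # 3 keys
V = rng.normal(size=(3, 4))   # 3 values
out, w = scaled_dot_product_attention(Q, K, V)
```

The real model adds multiple heads (separate learned projections of Q, K, V, run in parallel and concatenated), a causal mask in the decoder, and the 1/sqrt(d_k) scaling shown above to keep the logits in a range where the softmax has usable gradients.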

Week 2: Two papers on encoder-decoder models with attention for HMER (handwritten mathematical expression recognition): WAP and multi-scale attention. The CROHME datasets: 2016 and 2014. Note that there are CROHME data extractors on GitHub, where you can also find a number of implemented im2txt models.
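A distinctive ingredient of WAP is coverage-based attention: a running sum of past attention weights is fed back into the scoring function so the decoder is discouraged from re-attending to regions it has already transcribed. A simplified NumPy sketch of one decoding step (all weight matrices here are hypothetical stand-ins for learned parameters, and the coverage term is a plain vector rather than the convolutional coverage used in the paper):

```python
import numpy as np

def coverage_attention(h, annotations, coverage, Wh, Wa, Wc, v):
    """One step of additive attention with a coverage term, in the spirit
    of WAP: e_i = v^T tanh(Wh h + Wa a_i + Wc c_i)."""
    e = np.tanh(h @ Wh + annotations @ Wa + coverage[:, None] * Wc) @ v
    alpha = np.exp(e - e.max())
    alpha /= alpha.sum()                # attention weights over the L regions
    context = alpha @ annotations       # context vector for this decoder step
    return context, alpha, coverage + alpha  # coverage accumulates over steps

rng = np.random.default_rng(1)
L, d_h, d_a, d = 5, 8, 6, 7             # regions, decoder dim, annotation dim, attn dim
h = rng.normal(size=d_h)                 # decoder hidden state
ann = rng.normal(size=(L, d_a))          # encoder annotations (one per region)
cov = np.zeros(L)                        # coverage starts at zero
Wh = rng.normal(size=(d_h, d))
Wa = rng.normal(size=(d_a, d))
Wc, v = rng.normal(size=d), rng.normal(size=d)
ctx, alpha, cov = coverage_attention(h, ann, cov, Wh, Wa, Wc, v)
```

In WAP itself the annotations come from a CNN over the handwritten image and the coverage signal is produced by a convolution over the accumulated attention map, but the accumulate-and-penalize idea is the same.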

Week 3: A few papers using STNs in one way or another: scene text recognition, dense transformers, and hierarchical STNs. A paper on fine-grained attention.