Overview of the sequence-to-sequence (seq2seq) concept. Assumes you already know RNNs. We run char-level training on a few English-French sentences and see that it trains OK-ish. Brief walkthrough of the corresponding code, then a look at the challenges I ran into while trying to make a simple RNN model that trains in 30-60 seconds.
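To make the encoder-decoder idea concrete, here is a minimal sketch of a char-level seq2seq forward pass: an encoder RNN compresses the source characters into a final hidden state, which seeds a decoder RNN that emits target characters one at a time. This is an illustrative toy with random (untrained) weights, not the repo's actual code; the vocabularies, hidden size, and start/stop-token convention are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
src_vocab = sorted(set("hello"))            # toy source character set
tgt_vocab = sorted(set("bonjour")) + ["<eos>"]  # toy target set + stop token
H = 8  # hidden size (illustrative)

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

# Encoder: plain tanh RNN consumes the source chars left to right
W_xh = rng.normal(0, 0.1, (H, len(src_vocab)))
W_hh = rng.normal(0, 0.1, (H, H))
h = np.zeros(H)
for ch in "hello":
    x = one_hot(src_vocab.index(ch), len(src_vocab))
    h = np.tanh(W_xh @ x + W_hh @ h)

# Decoder: seeded with the encoder's final hidden state (the "thought
# vector"), greedily emits one target char per step until <eos> or a cap
V_xh = rng.normal(0, 0.1, (H, len(tgt_vocab)))
V_hh = rng.normal(0, 0.1, (H, H))
W_hy = rng.normal(0, 0.1, (len(tgt_vocab), H))
d = h.copy()
prev = len(tgt_vocab) - 1  # use <eos> as the start token (a convention choice)
out = []
for _ in range(10):
    x = one_hot(prev, len(tgt_vocab))
    d = np.tanh(V_xh @ x + V_hh @ d)
    prev = int(np.argmax(W_hy @ d))
    if tgt_vocab[prev] == "<eos>":
        break
    out.append(tgt_vocab[prev])
print("".join(out))  # weights are untrained, so the output is arbitrary
```

Training the real thing means backpropagating cross-entropy loss on each decoder step through both RNNs; that, plus teacher forcing, is what the linked source code adds on top of this skeleton.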
Slides: https://docs.google.com/presentation/d/1z9INuS1VX2UigL3WqB60oJCMi_CJuaVRi-Qaun6fPl4/edit?usp=sharing
Source-code: https://github.com/hughperkins/pub-prototyping/blob/df9cf0c9fa473517956c55c33435924a289ddd36/papers/attention/seq2seq_noattention_trainbyparts.py
Experiment log: https://github.com/hughperkins/pub-prototyping/blob/df9cf0c9fa473517956c55c33435924a289ddd36/papers/attention/mt_by_align_translate.log
"Sequence to Sequence Learning with Neural Networks", by Sutskever, Vinyals, and Le, 2014 https://arxiv.org/abs/1409.3215