Creating Shakespeare-like content with LSTM

Dear PyTorch community,

I am new to PyTorch and neural networks in general. After reading Andrej Karpathy's blog entry on recurrent neural networks, I tried to rebuild his Shakespeare example with PyTorch, with the help of some of the official PyTorch tutorials.
However, my tests so far do not generate any useful output.
I would appreciate any recommendations for improving my [code on GitHub](
The basic idea is to read all the characters of a Shakespeare text dataset and assign an integer to each letter. The model then consists of an LSTM layer followed by a linear layer that predicts the next letter.
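For reference, here is a minimal sketch of that setup, since I can't see your repository from the post alone. All names (`CharLSTM`, the toy `text`, the hyperparameters) are my own placeholders, not taken from your code, and the model is trained on a single tiny string just to show the wiring:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy corpus standing in for the Shakespeare dataset.
text = "to be or not to be that is the question"
chars = sorted(set(text))
char2idx = {c: i for i, c in enumerate(chars)}
idx2char = {i: c for c, i in char2idx.items()}
vocab_size = len(chars)

class CharLSTM(nn.Module):
    """Embedding -> LSTM -> Linear over the character vocabulary."""
    def __init__(self, vocab_size, hidden_size=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.fc(out), state

model = CharLSTM(vocab_size)
# Input is the text shifted against itself: predict character t+1 from t.
inp = torch.tensor([[char2idx[c] for c in text[:-1]]])
tgt = torch.tensor([[char2idx[c] for c in text[1:]]])

opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
losses = []
for _ in range(100):
    opt.zero_grad()
    logits, _ = model(inp)
    loss = loss_fn(logits.view(-1, vocab_size), tgt.view(-1))
    loss.backward()
    opt.step()
    losses.append(loss.item())

# Greedy generation: feed the model's own prediction back in.
model.eval()
with torch.no_grad():
    idx = torch.tensor([[char2idx["t"]]])
    state = None
    out_chars = ["t"]
    for _ in range(20):
        logits, state = model(idx, state)
        idx = logits[:, -1].argmax(dim=-1, keepdim=True)
        out_chars.append(idx2char[idx.item()])
sample = "".join(out_chars)
print(sample)
```

Note that this samples greedily with `argmax`; Karpathy's examples draw from the softmax distribution (often with a temperature), which gives much more varied output. If your generated text is a single repeated character, greedy sampling from an undertrained model is a common cause.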

Your assistance is highly appreciated.

Was your attempt successful?