Hey, I'm currently using Keras on top of TensorFlow and running into limitations.
I need to create an LSTM that takes a sequence, produces the next step in the sequence, then uses that output as part of the next input, for n cycles, before updating weights by comparing the n produced outputs against training data. Is it possible to do this with Torch?
(Picture included for clarity)
Yes, this is very much possible.
Just write the loop as a regular Python program and call backward() on the final loss tensor.
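To make that concrete, here's a minimal sketch of the idea (all names and sizes are illustrative, not from the original thread): warm up the LSTM on a seed sequence, feed its own prediction back in for n cycles, collect the n predicted steps, and run a single backward pass over one loss at the end.

```python
import torch
import torch.nn as nn

# Illustrative sizes; feature_size is the per-step dimensionality,
# n_future is the number of self-fed prediction cycles.
feature_size, hidden_size, n_future = 8, 16, 5

lstm = nn.LSTM(input_size=feature_size, hidden_size=hidden_size, batch_first=True)
proj = nn.Linear(hidden_size, feature_size)  # map hidden state back to a step

seed = torch.randn(1, 10, feature_size)           # (batch, seq_len, features) seed sequence
targets = torch.randn(1, n_future, feature_size)  # ground truth for the n predicted steps

# Warm up on the seed sequence, keeping the hidden state.
out, (h, c) = lstm(seed)
step = proj(out[:, -1:, :])  # first predicted next step

preds = []
for _ in range(n_future):
    preds.append(step)
    out, (h, c) = lstm(step, (h, c))  # feed the prediction back in as input
    step = proj(out[:, -1:, :])

preds = torch.cat(preds, dim=1)  # (1, n_future, feature_size)

# One loss over all n predicted steps, one backward pass through the whole loop.
loss = nn.functional.mse_loss(preds, targets)
loss.backward()
```

Because autograd records the whole Python loop dynamically, the single `loss.backward()` propagates gradients through all n feedback cycles; weights are only updated afterwards, exactly as described.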
Thank you, I’ll start getting familiar with PyTorch then!
When you publish your paper, you should call this Nostradamus-Net or Fortune-Teller-Net LOL
I take it you don’t believe it’s a reasonable direction, then?
I believe in running as many experiments as one can think of, and this is certainly a very interesting experiment. If it works, there are several possible applications for this type of LSTM.
Lately a lot of arXiv papers have really funny names, so I was merely suggesting some names in that vein.
Ah, cheers! I spend too much time on the internet and interpret everything as sarcasm.