If I want to build a basic LSTM GAN, is this a proper way of implementing the generator:
- Take the conditioning vector
- `.cat()` it with the previous output; if it is the first time step, `.cat()` it with zeros or a random tensor instead
- Pass the result as input to the LSTM along with the previous hidden state; if it is the first time step, initialize h_0 and c_0 to zeros
- Save the hidden state (h_n, c_n) in a variable for the next time step, and take h_n as the prediction for this step
- Go back to step 1 and repeat until the desired sequence length is reached
Is that the right algorithm?
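For reference, the steps above can be sketched roughly like this (a minimal PyTorch sketch, not a full GAN; the class name, sizes, and the final `nn.Linear` projection from h_n down to the output dimension are my own assumptions, since h_n usually has a different size than the output):

```python
import torch
import torch.nn as nn

class LSTMGenerator(nn.Module):
    # cond_size, out_size, hidden_size are illustrative; pick your own
    def __init__(self, cond_size, out_size, hidden_size=64):
        super().__init__()
        self.cell = nn.LSTMCell(cond_size + out_size, hidden_size)
        # assumption: project h_t to the output dimension
        self.proj = nn.Linear(hidden_size, out_size)

    def forward(self, cond, seq_len):
        batch = cond.size(0)
        # first step: no previous output yet, so use zeros
        prev_out = torch.zeros(batch, self.proj.out_features, device=cond.device)
        # first step: init h_0 and c_0 to zeros
        h = torch.zeros(batch, self.cell.hidden_size, device=cond.device)
        c = torch.zeros_like(h)
        outputs = []
        for _ in range(seq_len):
            # cat conditioning vector with previous output
            x = torch.cat([cond, prev_out], dim=1)
            # feed LSTM, carrying (h, c) forward to the next step
            h, c = self.cell(x, (h, c))
            # this step's prediction, fed back in at the next step
            prev_out = self.proj(h)
            outputs.append(prev_out)
        return torch.stack(outputs, dim=1)  # (batch, seq_len, out_size)

gen = LSTMGenerator(cond_size=8, out_size=3)
fake = gen(torch.randn(4, 8), seq_len=10)
print(fake.shape)  # torch.Size([4, 10, 3])
```

Whether you feed the generated output back in (as here) or use teacher forcing during training is a separate design choice.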