LSTM time sequence GAN implementation

If I want to build a basic LSTM GAN generator, is this a proper way of implementing it:

  1. Take the conditioning vector
  2. `.cat()` it with the previous time step's output; on the first time step, `.cat()` it with zeros or a random tensor instead
  3. Pass the result as input to the LSTM along with the previous hidden state; on the first time step, initialize `h_0` and `c_0` to zeros
  4. Save the hidden state `(h_n, c_n)` in a variable for the next time step, and take `h_n` as the prediction for the current time step
  5. Go back to step 1 and repeat until the desired sequence length is reached

Is that the right algorithm?
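
For reference, the loop described above can be sketched roughly like this in PyTorch. This is a minimal sketch, not a definitive implementation: the class name, dimensions, and the extra `nn.Linear` projection (mapping `h_n` into the output space so it can be fed back in) are my assumptions, and sampling noise, the discriminator, and training are omitted.

```python
import torch
import torch.nn as nn

class LSTMGenerator(nn.Module):
    # Hypothetical sizes; adjust to your data.
    def __init__(self, cond_dim=16, out_dim=8, hidden_dim=32):
        super().__init__()
        self.cell = nn.LSTMCell(cond_dim + out_dim, hidden_dim)
        # Assumption: project h_n down to the output space so it can be
        # concatenated back with the conditioning vector next step.
        self.proj = nn.Linear(hidden_dim, out_dim)
        self.out_dim = out_dim
        self.hidden_dim = hidden_dim

    def forward(self, cond, seq_len):
        batch = cond.size(0)
        # Step 2: first time step uses zeros (could also be noise) as the
        # "previous output".
        prev_out = torch.zeros(batch, self.out_dim)
        # Step 3: init h_0 and c_0 to zeros on the first time step.
        h = torch.zeros(batch, self.hidden_dim)
        c = torch.zeros(batch, self.hidden_dim)
        outputs = []
        for _ in range(seq_len):
            # Steps 1-2: concatenate conditioning vector with previous output.
            x = torch.cat([cond, prev_out], dim=1)
            # Steps 3-4: advance the LSTM, keep (h, c) for the next step.
            h, c = self.cell(x, (h, c))
            prev_out = self.proj(h)  # prediction for this time step
            outputs.append(prev_out)
        # Step 5: stack the per-step predictions into one sequence.
        return torch.stack(outputs, dim=1)  # (batch, seq_len, out_dim)

gen = LSTMGenerator()
seq = gen(torch.randn(4, 16), seq_len=10)
print(seq.shape)  # torch.Size([4, 10, 8])
```

Note that `nn.LSTMCell` is used instead of `nn.LSTM` because the input at each step depends on the previous step's output, so the sequence cannot be fed in all at once.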