hidden — the one obtained from the previous sequence. That is, do not create a new hidden state, but reuse the previous one each time? The thing is, I am working with a continuous sequence, so it seems to me that the hidden state should be carried over continuously.

I’m not sure I understand the question clearly, but it seems you would like to initialize the hidden state once and then just use it for the whole training?

def forward(self, out):
    out = self.fc1(out)
    out = torch.transpose(out, 0, 1)
    hidden = self.__init__hidden(batch)
    out, hidden = self.gru(out, hidden)
    out = self.fc3(hidden)
    out = out.reshape(batch, 3)
    return out

def __init__hidden(self, batch):
    hidden = torch.randn(1, batch, 512).to(device)
    return hidden

When I call outputs = net(wn), forward is triggered.
This happens for every sequence in the batch.

And each time, the hidden variable is initialized anew, because forward calls __init__hidden. I want __init__hidden to run only on the first pass; on the second and subsequent passes, hidden should be carried over from the previous sequence (out, hidden = self.gru(out, hidden)).
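One common way to get this behavior is to store the hidden state as an attribute on the module: initialize it lazily on the first forward call, reuse it on later calls, and detach it between calls so gradients do not propagate back through earlier sequences. Below is a minimal sketch of that pattern; the layer sizes, class name, and the single fc layer are illustrative assumptions, not the exact architecture from the post above.

```python
import torch
import torch.nn as nn

class StatefulGRU(nn.Module):
    """Sketch: carry the GRU hidden state across forward() calls."""

    def __init__(self, input_size=8, hidden_size=512, output_size=3):
        super().__init__()
        self.gru = nn.GRU(input_size, hidden_size)  # num_layers=1
        self.fc = nn.Linear(hidden_size, output_size)
        self.hidden = None  # persists between forward() calls

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        batch = x.size(1)
        if self.hidden is None or self.hidden.size(1) != batch:
            # First call (or batch size changed): create a fresh state.
            self.hidden = torch.zeros(
                1, batch, self.gru.hidden_size, device=x.device
            )
        out, h = self.gru(x, self.hidden)
        # Detach so backprop is truncated at the sequence boundary;
        # without this, graphs from all previous batches are retained.
        self.hidden = h.detach()
        return self.fc(h.squeeze(0))  # (batch, output_size)

net = StatefulGRU()
y1 = net(torch.randn(10, 4, 8))  # first call initializes the state
y2 = net(torch.randn(10, 4, 8))  # reuses the state from the first call
```

The detach() call is the important detail: if you keep the raw hidden tensor, the autograd graph grows across every batch and backward() will eventually fail (or exhaust memory) because earlier graphs have already been freed.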