Suppose I have a very simple LSTM model:
```python
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.lstm = nn.LSTM(input_size, hidden_size)

    def init_hidden(self, batch_size):
        # initialize hidden and cell states to zeros
        return (torch.zeros(1, batch_size, self.hidden_size),
                torch.zeros(1, batch_size, self.hidden_size))

    def forward(self, inputs):
        # inputs: (seq_len, batch, input_size)
        lstm_hidden = self.init_hidden(inputs.size(1))
        output, (hidden, cell) = self.lstm(inputs, lstm_hidden)
        return output
```
Then, when I train my model, does `lstm_hidden` get re-initialized in every epoch by `lstm_hidden = self.init_hidden()`?
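To make the question concrete, here is a minimal sketch of the training loop I have in mind; the sizes, loss, and optimizer are arbitrary placeholders, not part of the original model:

```python
model = Model(input_size=4, hidden_size=8)                # sizes are arbitrary
criterion = nn.MSELoss()                                  # placeholder loss
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # placeholder optimizer

inputs = torch.randn(5, 3, 4)   # (seq_len, batch, input_size) dummy data
targets = torch.randn(5, 3, 8)  # matches the output shape

for epoch in range(2):
    optimizer.zero_grad()
    # forward() calls self.init_hidden() on every invocation,
    # so the hidden/cell states are rebuilt each forward pass
    output = model(inputs)
    loss = criterion(output, targets)
    loss.backward()
    optimizer.step()
```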
If so, does it always start from a zero hidden state (rather than learned values)?
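As a quick sanity check on my own question (using the `Model` sketch above):

```python
h0, c0 = model.init_hidden(batch_size=3)
print(h0.abs().sum(), c0.abs().sum())  # both tensor(0.), i.e. all zeros
```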