How is nn.LSTM trained?

Suppose I have a very simple LSTM model:

class Model(nn.Module):
    def __init__(self, ...):
        super().__init__()
        self.lstm = nn.LSTM(...)

    def init_hidden(self):
        # initialize hidden and cell states to zeros
        ...

    def forward(self, inputs):
        lstm_hidden = self.init_hidden()
        output, (hidden, cell) = self.lstm(inputs, lstm_hidden)

Then, when I train my model, is lstm_hidden re-initialized on every epoch by lstm_hidden = self.init_hidden()?
If so, does it always start from a zero hidden state?

In your code the hidden state is initialized to zeros in each forward pass, i.e. in every iteration, not just once per epoch. Note that the hidden state is an activation, not a weight: the LSTM's learnable weights persist and are updated by the optimizer, while the hidden state is simply rebuilt from zeros each time init_hidden() is called.
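A minimal runnable sketch of this behavior (the class name, sizes, and init_hidden signature below are illustrative, not from your code). It also shows that passing an explicit zero state is equivalent to passing no state at all, since nn.LSTM defaults the hidden and cell states to zeros when they are omitted:

```python
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self, input_size=4, hidden_size=8):
        super().__init__()
        self.hidden_size = hidden_size
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

    def init_hidden(self, batch_size):
        # Fresh zero hidden and cell states, rebuilt on every call.
        # Shape: (num_layers * num_directions, batch, hidden_size)
        h0 = torch.zeros(1, batch_size, self.hidden_size)
        c0 = torch.zeros(1, batch_size, self.hidden_size)
        return (h0, c0)

    def forward(self, inputs):
        # Re-initialized here => zeros on every forward pass / iteration.
        hidden = self.init_hidden(inputs.size(0))
        output, (h_n, c_n) = self.lstm(inputs, hidden)
        return output

model = Model()
x = torch.randn(2, 5, 4)  # (batch, seq_len, input_size)

out_explicit = model(x)            # explicit zero initial state
out_default, _ = model.lstm(x)     # no initial state: defaults to zeros

print(torch.allclose(out_explicit, out_default))  # True
```

If you instead want the hidden state to carry over across batches (stateful training), you would have to store and detach it between forward passes rather than re-creating it each time.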