Proper Recurrent Neural Network Forward Pass

I’m running PyTorch 1.0.1 and I’m new to the framework, so I decided the best way to learn would be to re-implement models from other frameworks from scratch. I’ve defined my model below, and I’m unsure whether backpropagation will actually work through my previous hidden state (`self.hp`)?
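For context, the recurrence I’m trying to implement is the vanilla RNN (note that on the first time step of each `forward()` call the code skips the `Whh` term):

    h_t = tanh(Wxh · x_t + Whh · h_{t-1} + bh)    for t > 1
    h_1 = tanh(Wxh · x_1 + bh)
    o_t = clamp(Why · h_t + by, -10, 10)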

My model is defined as follows:

import torch
import torch.nn as nn

# hidden_size and voc_size are module-level globals in my script
class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.Wxh = nn.Parameter(torch.randn([hidden_size, voc_size]))
        self.Whh = nn.Parameter(torch.randn([hidden_size, hidden_size]))
        self.Why = nn.Parameter(torch.randn([voc_size, hidden_size]))
        self.bh = nn.Parameter(torch.randn([hidden_size, 1]))
        self.by = nn.Parameter(torch.randn([voc_size, 1]))

        # previous hidden state, carried across forward() calls
        self.hp = torch.zeros([hidden_size, 1], dtype=torch.float32, device="cuda")

    def forward(self, x):
        h = torch.empty([hidden_size, 0], dtype=torch.float32, device="cuda")
        first = True
        for t in x.split(1, dim=1):  # iterate over time steps (columns of x)
            if not first:
                calc = self.Wxh @ t + self.Whh @ self.hp + self.bh
            else:
                calc = self.Wxh @ t + self.bh
                first = False
            h = torch.cat([h, torch.tanh(calc)], dim=1)
            self.hp = torch.tanh(calc)
        o = torch.clamp(self.Why @ h + self.by, -10, 10)
        return o

    def zeroH(self):
        self.hp = torch.zeros([hidden_size, 1], dtype=torch.float32, device="cuda")

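To make my worry concrete, here’s a tiny CPU repro (made-up sizes, plain globals instead of my `Model` class) of what I think happens when the hidden state is carried across `forward()` calls without `.detach()`:

```python
import torch

# Hypothetical 3x3 stand-in for Whh; hp plays the role of self.hp.
Whh = torch.randn(3, 3, requires_grad=True)
hp = torch.zeros(3, 1)

def step(x):
    # One recurrent step; hp now references the entire graph so far.
    global hp
    hp = torch.tanh(Whh @ hp + x)
    return hp.sum()

step(torch.randn(3, 1)).backward()  # batch 1: fine, graph buffers freed

try:
    # batch 2: backward reaches back into batch 1's already-freed graph
    step(torch.randn(3, 1)).backward()
    failed = False
except RuntimeError:
    failed = True
print(failed)  # True

# Detaching between batches (truncated BPTT) cuts the graph:
hp = hp.detach()
step(torch.randn(3, 1)).backward()  # works again
```

So if my understanding is right, I’d need to `.detach()` `self.hp` between batches (or in `zeroH()`) rather than expecting gradients to flow across calls, but I’d appreciate confirmation.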
Thank you, and sorry for the relatively simple question!