Recurrent Batch Normalization Implementation in PyTorch

I want to use the implementation of RBN by jihunchoi, which I found on GitHub: recurrent-batch-normalization-pytorch/bnlstm.py at master · jihunchoi/recurrent-batch-normalization-pytorch · GitHub
I have a question regarding the following code snippet in the forward method of the LSTM class:
if hx is None:
    hx = (Variable(nn.init.xavier_uniform(weight.new(self.num_layers, batch_size, self.hidden_size))),
          Variable(nn.init.xavier_uniform(weight.new(self.num_layers, batch_size, self.hidden_size))))

Python raises a NameError because weight is not defined (indeed, it is not defined anywhere in the method). I'm not sure how to fix the problem, since I don't see what this snippet is intended to do, and especially what weight.new() is for. Any help explaining this code snippet and how to fix it would be very much appreciated!

In older PyTorch code, weight.new(*sizes) creates an uninitialized tensor with the same dtype and device as weight; such snippets usually obtain weight beforehand, e.g. via weight = next(self.parameters()).data, and that lookup seems to be missing here. Since weight is undefined, replace it with either torch.empty or torch.randn with the desired shapes, and make sure the dtype (float32 by default) and the device are as expected.
Also, note that Variables have been deprecated since PyTorch 0.4, so you can use plain tensors now.
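
Here is a minimal sketch of the suggested fix, assuming the surrounding LSTM class exposes self.num_layers and self.hidden_size as in the linked bnlstm.py; init_hidden and the ref lookup are hypothetical names used only for illustration:

import torch
import torch.nn as nn

def init_hidden(module, num_layers, batch_size, hidden_size):
    # Hypothetical helper: build an (h_0, c_0) pair whose dtype and device
    # match the module's existing parameters (assumes the module has at
    # least one registered parameter).
    ref = next(module.parameters())
    shape = (num_layers, batch_size, hidden_size)
    h_0 = nn.init.xavier_uniform_(
        torch.empty(*shape, dtype=ref.dtype, device=ref.device))
    c_0 = nn.init.xavier_uniform_(
        torch.empty(*shape, dtype=ref.dtype, device=ref.device))
    return h_0, c_0

Inside forward, the original snippet would then become:

if hx is None:
    hx = init_hidden(self, self.num_layers, batch_size, self.hidden_size)

Note that nn.init.xavier_uniform (without the trailing underscore) is also deprecated; the in-place xavier_uniform_ initializes the tensor and returns it, so no Variable wrapper is needed.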