ReLU the weights at the end of the RNN, with autograd computing the gradient.

Why was weight.grad NoneType for me? It's probably not the for loop — autograd handles an unrolled forward loop fine. More likely: `.grad` is only populated on leaf tensors with `requires_grad=True`, and applying ReLU to the weight produces a new non-leaf tensor whose `.grad` stays `None`.
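A minimal PyTorch sketch of the leaf vs. non-leaf distinction, assuming the ReLU is applied to the weight before an unrolled RNN-style loop (shapes and the toy recurrence are made up for illustration):

```python
import torch

# Leaf parameter: autograd populates .grad on it after backward().
w = torch.randn(4, 4, requires_grad=True)

# ReLU produces a NEW, non-leaf tensor; its .grad is never populated.
w_pos = torch.relu(w)

# Unrolled "RNN" forward in a for loop -- autograd handles this fine.
h = torch.zeros(4)
for t in range(3):
    h = torch.tanh(w_pos @ h + 1.0)

loss = h.sum()
loss.backward()

print(w.grad is not None)  # True: gradient flowed back to the leaf
print(w_pos.is_leaf)       # False: reading .grad here would give None
```

So the fix is to call `backward()` first and read `.grad` on the original leaf parameter, not on the ReLU-ed tensor.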