Check value of gradient

Hi, I am training a neural network with PyTorch and I want to check whether the gradients of the loss total_loss (the one I call total_loss.backward() on) are constant with respect to the weights contained in an nn.Sequential block, call it block.
How can I easily achieve this? Thanks!

You can iterate over the model's parameters with for name, param in model.named_parameters(): and then inspect param.grad for each one. Each param.grad should be either a torch.Tensor or None (None means no gradient reached that parameter, e.g. because the parameter is unused or backprop is broken by a bug).
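To test whether the gradients are actually constant with respect to the weights (and not just present), you can snapshot the gradients, perturb the block's weights, recompute the gradients on the same input, and compare. A minimal sketch, where block and the loss are hypothetical stand-ins for your actual model and total_loss:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for your nn.Sequential block.
block = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 1))

# Fixed input so the only thing changing between passes is the weights.
x = torch.randn(16, 4)

def block_grads():
    """Run one backward pass on the fixed input and snapshot the block's gradients."""
    block.zero_grad()                      # clear gradients from any earlier pass
    total_loss = block(x).pow(2).mean()    # stand-in for your real total_loss
    total_loss.backward()
    return {name: p.grad.clone() for name, p in block.named_parameters()}

g_before = block_grads()

# Perturb the weights in-place; if the gradient is constant w.r.t. the
# weights, it should be unchanged after this perturbation.
with torch.no_grad():
    for p in block.parameters():
        p.add_(0.1 * torch.randn_like(p))

g_after = block_grads()

for name in g_before:
    constant = torch.allclose(g_before[name], g_after[name])
    print(f"{name}: gradient unchanged after weight perturbation: {constant}")
```

If every parameter prints True, the gradient did not depend on the weights (at least locally, for that input); any False means the loss surface has curvature in that parameter. Comparing param.grad snapshots across two training steps works the same way, but then both the weights and the input batch change between passes.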