Accessing gradients directly from the loss

Is there a way to access the gradients directly from the loss, without using an optimizer?

For example,

inputs = inputs.half().to(device=self.args.device, non_blocking=True)
labels = labels.to(device=self.args.device, non_blocking=True)
outputs = self.model(inputs)
minibatch_loss = self.criterion(outputs, labels)
minibatch_loss.backward()

Can I somehow access the gradients from minibatch_loss?

Hi,

If you call .backward() (or torch.autograd.backward()), it populates the .grad field of all the leaf Tensors that require gradients (i.e. Tensors created with requires_grad=True, which includes nn.Parameters).
So since you use an nn.Module, you can do:

for p in self.model.parameters():
    print(p.grad)  # gradient of minibatch_loss w.r.t. this parameter
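
As a minimal self-contained sketch of that pattern (the two-layer model, the criterion and the random data below are hypothetical placeholders, not your actual self.model / self.criterion setup), you can run the backward pass and then inspect the populated .grad fields by parameter name:

import torch
import torch.nn as nn

# hypothetical stand-ins for self.model, self.criterion and the minibatch
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(16, 4)
labels = torch.randint(0, 2, (16,))

outputs = model(inputs)
minibatch_loss = criterion(outputs, labels)
minibatch_loss.backward()

# after backward(), each parameter's .grad holds d(minibatch_loss)/d(parameter)
for name, p in model.named_parameters():
    print(name, p.grad.shape)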

Otherwise, you can use autograd.grad to get the gradients as a tuple of Tensors, one per input you pass in, by calling grads = autograd.grad(minibatch_loss, self.model.parameters()). Note that this does not populate the .grad fields.
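
A hedged sketch of the autograd.grad route (again with a hypothetical placeholder model and data, not your actual training code): the returned tuple is ordered like the parameters you pass in, and the .grad attributes stay untouched unless you copy the values there yourself.

import torch
import torch.nn as nn
from torch import autograd

# hypothetical placeholder model and minibatch, as in the sketch above
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(16, 4)
labels = torch.randint(0, 2, (16,))

outputs = model(inputs)
minibatch_loss = criterion(outputs, labels)

params = list(model.parameters())
grads = autograd.grad(minibatch_loss, params)  # tuple of Tensors; .grad fields are not populated

for p, g in zip(params, grads):
    print(p.shape, g.shape)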