If you call autograd.backward() (for example via loss.backward()), it populates the .grad field of all the leaf Tensors that require gradients (tensors created with requires_grad=True, which includes every nn.Parameter).
So since your model is an nn.Module, you can do:
for p in self.model.parameters():
    print(p.grad)
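For instance, a minimal self-contained sketch (the tiny model and random batch here are made up purely for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical toy model and minibatch, just to illustrate.
model = nn.Linear(4, 2)
x = torch.randn(8, 4)
target = torch.randn(8, 2)

loss = nn.functional.mse_loss(model(x), target)
loss.backward()  # populates p.grad for every parameter

for p in model.parameters():
    print(p.grad.shape)  # each .grad has the same shape as its parameter
```

Before backward(), each p.grad is None; after it, the gradients are accumulated in place, which is why you typically call optimizer.zero_grad() between iterations.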
Otherwise, you can use autograd.grad to get the gradients as a tuple of Tensors (without touching the .grad fields) by calling grads = autograd.grad(minibatch_loss, self.model.parameters()).
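A sketch of the autograd.grad variant, using the same kind of hypothetical toy model; note that it returns the gradients directly rather than writing them into .grad:

```python
import torch
import torch.nn as nn

# Hypothetical toy model and minibatch, just to illustrate.
model = nn.Linear(4, 2)
x = torch.randn(8, 4)
target = torch.randn(8, 2)

minibatch_loss = nn.functional.mse_loss(model(x), target)

# Returns a tuple of Tensors, one per parameter, and leaves
# the parameters' .grad fields untouched (still None here).
grads = torch.autograd.grad(minibatch_loss, model.parameters())

for g, p in zip(grads, model.parameters()):
    print(g.shape, p.shape)  # matching shapes, pairwise
```

This is handy when you want the gradients as values (e.g. for meta-learning or gradient norms) without mutating the model's state.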