Hi! I’m new here. I found some similar questions, but I can’t find a good way to get the gradients when I backpropagate from the loss of a minibatch of size N.
Here’s the code:
scores = model(X_var)
loss = -1 * torch.log(scores)
for i in range(num_train):
    loss[i].backward(retain_graph=True)
grads = X_var.grad
X_var is of shape (N, 3, H, W) and the loss is of shape (N,). How can I get the gradients of X_var directly, with one line of code, rather than with a for loop?
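One common way to avoid the loop: since each loss[i] depends only on X_var[i], summing the per-sample losses and calling backward once produces the same per-sample gradients in X_var.grad. Here is a minimal, self-contained sketch; the model, the label tensor y, and the correct-class indexing inside the log are all hypothetical stand-ins, since the original snippet does not show them.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model and data, just to make the sketch runnable.
N, C, H, W = 4, 3, 8, 8
model = nn.Sequential(nn.Flatten(), nn.Linear(C * H * W, 10), nn.Softmax(dim=1))

X_var = torch.randn(N, C, H, W, requires_grad=True)
y = torch.randint(0, 10, (N,))  # assumed class labels

scores = model(X_var)
# Per-sample loss of shape (N,): negative log of each sample's correct-class score.
loss = -torch.log(scores[torch.arange(N), y])

# Each loss[i] depends only on X_var[i], so one backward pass on the sum
# yields the same per-sample gradients as looping over loss[i].backward().
loss.sum().backward()   # equivalently: loss.backward(torch.ones_like(loss))
grads = X_var.grad      # shape (N, 3, H, W)
```

Passing `torch.ones_like(loss)` to `backward` works because backpropagating a vector loss with an all-ones gradient argument is exactly backpropagating its sum.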