How to compute grad w.r.t. a single element of a tensor

Hi there,

Consider this scenario.

y = model(x)  # shape of y: B x C
grad_x = torch.autograd.grad(y[3, 2], x)

What I want to do is take the grad of y[3, 2] w.r.t. x. However, when I write it as above, it raises:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Any suggestions? Thank you so much!

This likely has nothing to do with the single-element indexing. More probably, the model breaks the graph somewhere (assuming x requires grad in the first place). To check, see whether y.requires_grad is True.
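
For example, a quick check (a minimal sketch; the nn.Linear stands in for your own model):

import torch
import torch.nn as nn

model = nn.Linear(3, 2)                    # stand-in for your own model
x = torch.randn(4, 3, requires_grad=True)  # the input must require grad
y = model(x)

print(x.requires_grad)  # True, otherwise nothing upstream of x can get a grad
print(y.requires_grad)  # if this is False, the graph was broken inside the model
print(y.grad_fn)        # None here also means the graph is broken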

P.S.: Note that the terminology in the title is slightly off relative to the code: the code computes the grad of the single-element tensor y[3, 2] w.r.t. x, not the grad w.r.t. a single element.


Hi @phantom90,
Please see the following:

import torch
import torch.nn as nn

class Model(nn.Module):
  def __init__(self):
    super().__init__()
    self.fc1 = nn.Linear(2, 4)

  def forward(self, x):
    return self.fc1(x)

net = Model()
input_x = torch.tensor([1.0, 2.0], requires_grad=True)  # without requires_grad=True, you get exactly your error
out_y = net(input_x)  # shape: (4,)
gradient = torch.autograd.grad(out_y[1], input_x)  # grad of a single element w.r.t. input_x
print(gradient)  # e.g. (tensor([0.3005, 0.3967]),) -- values depend on the random init

This shows that you can calculate the gradient of a single element of a tensor (selected by indexing) w.r.t. some other tensor.
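
For a shape closer to the original question (batch and feature sizes here are made up for illustration), indexing one element of a B x C output works the same way:

import torch
import torch.nn as nn

net = nn.Linear(5, 4)                      # maps 5 features to C = 4 outputs
x = torch.randn(8, 5, requires_grad=True)  # B = 8 samples, 5 features each
y = net(x)                                 # shape: B x C = 8 x 4

# Gradient of the single scalar y[3, 2] w.r.t. the whole input x.
grad_x, = torch.autograd.grad(y[3, 2], x)
print(grad_x.shape)  # torch.Size([8, 5]); only row 3 is non-zero for this linear layer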

Edit: Just noticed @tom had already suggested the possible reasons while I was writing this reply.

As they’ve suggested, make sure the graph isn’t breaking anywhere.
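
For illustration, one common way the graph gets cut inside a model (the detach() here is just an assumed example; using .data or converting to NumPy behaves similarly):

import torch
import torch.nn as nn

fc = nn.Linear(2, 2)
x = torch.tensor([1.0, 2.0], requires_grad=True)

y_ok = fc(x)               # graph intact
y_broken = fc(x).detach()  # detach() cuts the tensor out of the graph

print(y_ok.requires_grad)      # True
print(y_broken.requires_grad)  # False -> autograd.grad on this raises your RuntimeError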
