This likely has nothing to do with the single-element indexing; rather, the model breaks the graph somewhere (assuming x requires grad). To check, see whether y.requires_grad is True.

P.S.: Note that the terminology in the title is slightly off relative to the code, as it attempts to compute the grad of the single-element tensor y[3, 2] w.r.t. x.

import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(2, 4)

    def forward(self, x):
        return self.fc1(x)

net = Model()
input_x = torch.tensor([1.0, 2.0], requires_grad=True)  # with requires_grad=False, you would get your error
out_y = net(input_x)
gradient = torch.autograd.grad(out_y[1], input_x)
print(gradient)  # e.g. (tensor([0.3005, 0.3967]),) - exact values depend on the random init

This shows that you can calculate the gradient of a single element of a tensor (by indexing) wrt some other tensor.
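To match the title's setup more literally, here is a minimal sketch (shapes and the doubling op are assumptions, not your code) that takes the grad of the single element y[3, 2] of a 2D tensor w.r.t. x:

```python
import torch

# x is a 4x3 tensor; y is computed from it by a differentiable op,
# so the graph stays intact and y.requires_grad is True.
x = torch.randn(4, 3, requires_grad=True)
y = x * 2

# Gradient of the single (scalar) element y[3, 2] w.r.t. all of x.
grad_x, = torch.autograd.grad(y[3, 2], x)
print(grad_x[3, 2])  # tensor(2.) - only this element influences y[3, 2]
```

Since y[3, 2] depends only on x[3, 2], the returned gradient is 2 at that position and 0 everywhere else.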

Edit: Just noticed @tom had already suggested the possible reasons while I was writing this reply.

As they’ve suggested, make sure the graph isn’t breaking anywhere.