"element 0 of tensors does not require grad and does not have a grad_fn" — how can I fix this?

train

device = get_device()
print(device)

def train(train_set):
    epochs = 1000
    learning_rate = 0.01
    optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)
    loss_f = nn.MSELoss()
    for epoch in range(epochs):
        for input, label in train_set:
            optimizer.zero_grad()
            input = input.to(device)
            label = label.to(device)
            print(input.shape)
            print(label.shape)
            output = model(input)
            print(output.shape)
            loss = loss_f(output, label)
            loss.backward()
            optimizer.step()

        print("epoch: {}, loss: {}".format(epoch, loss.item()))

output

cuda
torch.Size([270, 93])
torch.Size([270])
torch.Size([270])

How can I fix this?

This error is raised if the model output or loss has been detached from the computation graph, e.g. via:

  • using another library such as numpy
  • using non-differentiable operations such as torch.argmax
  • explicitly detaching the tensor via tensor = tensor.detach()
  • rewrapping the tensor via x = torch.tensor(x)

or if the gradient calculation was disabled in the current context or globally such that no computation graph was created at all.
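Not from the original posts, but a minimal sketch reproducing the detaching causes listed above (the round-trip through numpy, the non-differentiable op, and the explicit detach), assuming a fresh tensor `x` rather than the poster's model:

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2).sum()
print(y.grad_fn)  # a valid backward function -> still attached to the graph

a = torch.from_numpy(x.detach().numpy())  # round-trip through numpy
b = torch.argmax(x)                       # non-differentiable operation
c = y.detach()                            # explicit detach
for t in (a, b, c):
    print(t.grad_fn, t.requires_grad)     # None False

try:
    c.backward()  # raises: element 0 of tensors does not require grad ...
except RuntimeError as e:
    print(e)
```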

To debug this issue, check the .grad_fn attribute of the loss, then the model output, and then the intermediate activations created in the forward method of your model, and make sure a valid backward function is returned. If None is returned, that tensor is not attached to any computation graph.
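A minimal sketch of this debugging approach, using a stand-in nn.Linear model rather than the poster's actual model:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)
x = torch.randn(2, 4)

out = model(x)
loss = out.mean()

print(loss.grad_fn)  # valid backward function -> attached
print(out.grad_fn)   # valid backward function -> attached

detached = out.detach()
print(detached.grad_fn)  # None -> not attached to any computation graph
```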

x = torch.tensor([[1., -1.], [1., 1.]], requires_grad=True)
out = x.pow(2).sum()
out.backward()
x.grad

This is an example from torch.Tensor — PyTorch 1.10.1 documentation.

The expected output is tensor([[ 2.0000, -2.0000], [ 2.0000, 2.0000]]),

but my PyTorch also raises the same error:
element 0 of tensors does not require grad and does not have a grad_fn

I've found the problem, thanks!!!

If the posted code snippet is still raising the issue, I guess you've disabled autograd globally. Or what was the issue?
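For reference, a minimal sketch of how globally disabled autograd reproduces this exact error (again using a stand-in nn.Linear model):

```python
import torch
import torch.nn as nn

torch.set_grad_enabled(False)  # disables autograd globally

model = nn.Linear(4, 1)
loss = model(torch.randn(2, 4)).mean()
print(loss.requires_grad)  # False: no graph was created at all

try:
    loss.backward()  # raises: element 0 of tensors does not require grad ...
except RuntimeError as e:
    print(e)

torch.set_grad_enabled(True)  # restore the default
```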

Hi Lynn,
I'm trying to run the same piece of code, but I'm getting the "element 0 …" error.
How did you solve this issue?