An actual leaf tensor triggers "'inputs' to backward is not a leaf Tensor" in PyTorch

Hi,

I’m trying to compute the gradient with respect to the input data. I set requires_grad = True on the input and then call autograd.backward(). PyTorch then raises “One of the differentiated Tensors given as ‘inputs’ to backward is not a leaf Tensor”. But inputs.is_leaf is True, so what’s going on here? Any ideas?

Here is my code to reproduce (run it in Colab):

import torch
import torch.nn as nn
import torch.autograd as autograd

def linear_block(in_channel, out_channel):
    # One hidden block: a linear transform followed by a Tanh activation
    block = nn.Sequential(
        nn.Linear(in_channel, out_channel),
        nn.Tanh()
    )
    return block

class FCNet(nn.Module):
    def __init__(self, layers=[2, 10, 1]):
        super(FCNet, self).__init__()

        # zip truncates to the shorter list, so this pairs consecutive sizes
        # for every hidden layer; the output layer is a bare Linear (no Tanh)
        fc_list = [linear_block(in_size, out_size) for in_size, out_size in zip(layers, layers[1:-1])]
        fc_list.append(nn.Linear(layers[-2], layers[-1]))
        self.fc = nn.Sequential(*fc_list)

    def forward(self, x):
        return self.fc(x)

layers = [2, 10, 1]

xt = torch.randn((100, 2))
model = FCNet(layers)

xt.requires_grad = True  # make the input require gradients
ys = model(xt)

print(xt.is_leaf)  # True
# Raises: "One of the differentiated Tensors given as 'inputs' to backward is not a leaf Tensor"
autograd.backward(tensors=ys.sum(), inputs=xt, create_graph=True)
grad_xt = xt.grad

autograd.backward asks for a sequence of tensors, not a single tensor:

autograd.backward(tensors=[ys.sum()], inputs=[xt], create_graph=True)
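With the list form the call succeeds, and xt.grad holds the gradient of ys.sum() with respect to xt.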

When you pass a bare tensor, the implementation is probably iterating over it, and since iterating a tensor yields views of it (which are never leaves), you get the confusing non-leaf error instead of a clearer one about the argument type.
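Here is a minimal sketch of why iteration produces that error (assuming iteration over the first dimension, which is what tuple() does with a tensor): each row of a leaf tensor is a view, and views are non-leaf tensors with a grad_fn:

import torch

x = torch.randn(3, 2, requires_grad=True)
print(x.is_leaf)        # True: x was created directly by the user

rows = tuple(x)         # iterating a tensor yields views along dim 0
print(rows[0].is_leaf)  # False: each row is the result of an op on x
print(rows[0].grad_fn)  # <SelectBackward0 ...>, i.e. part of the autograd graph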


You’re right. Thanks!