RuntimeError: Element 0 of tensors does not require grad and does not have a grad_fn

Hi all, I have been doing some reading on this issue online and tried the suggested solutions, but with no success.

For some context, I am using a pre-trained model, and I want to use my modified classifier.

This is how I create the model:

import torch
import torch.nn as nn
import torchvision

model = torchvision.models.vgg11(pretrained=True)
flag = 0
# We want this to happen only for the first 2 children (features and avgpool)
for child in model.children():
   if flag < 2:
      for parameter in child.parameters():
         parameter.requires_grad = False
      flag += 1
model.classifier.add_module('lastfc', nn.Linear(1000, 1))
model.classifier.add_module('lastSigmoid', nn.Sigmoid())

I want to freeze the features and avgpool children so I won't have to retrain them.
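As a side note, a minimal sketch of freezing those two submodules by name instead of counting children (assuming the standard torchvision VGG layout with features, avgpool, and classifier attributes):

for name, child in model.named_children():
   if name in ('features', 'avgpool'):
      # Freeze every parameter in these submodules
      for parameter in child.parameters():
         parameter.requires_grad = False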

This is my loss and optimizer:

loss_fn = nn.BCELoss()
optimizer = torch.optim.Adam(model.classifier.parameters(), lr=0.001)
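If you ever pass the whole model to the optimizer instead of just the classifier, you can filter out the frozen parameters. A small sketch, assuming the freezing loop above has already run:

optimizer = torch.optim.Adam(
   (p for p in model.parameters() if p.requires_grad),
   lr=0.001,
)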

And this is inside of the train function:

X, y = X.to(device), y.to(device)
# Compute prediction error

pred = model(X)
pred = (pred > 0.5).float().squeeze()
y = torch.tensor(y, dtype=torch.float32, device=device)
loss = loss_fn(pred, y)

# Backpropagation
optimizer.zero_grad()
loss.backward()
optimizer.step()

But when I call

loss.backward()

I get the error “Element 0 of tensors does not require grad and does not have a grad_fn”

What am I doing wrong?

As a sanity check, I tried running this in a debug session:

model.classifier[0].weight.requires_grad
True

This line of code:

pred = (pred > 0.5).float().squeeze()

detaches pred from the computation graph, because the comparison (and the cast) is not differentiable, so the result no longer requires grad and has no grad_fn.
Pass the probability directly to nn.BCELoss (or, better, remove the final sigmoid and use nn.BCEWithLogitsLoss for improved numerical stability) and it should work.
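A minimal sketch of how the training step could look with that change, assuming model, loss_fn = nn.BCELoss(), optimizer, and device are defined as in your post:

X, y = X.to(device), y.to(device)

# Forward pass: keep the raw probability so the graph stays intact
pred = model(X).squeeze(1)        # shape: (batch,)
loss = loss_fn(pred, y.float())   # BCELoss expects float targets

# Backpropagation
optimizer.zero_grad()
loss.backward()
optimizer.step()

# The 0.5 threshold is only applied afterwards, for metrics such as accuracy
predicted_labels = (pred > 0.5).float()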
