Hi all,
I'm facing this error:

    RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

when I try to run the backward pass while training a ResNet + deconvolutional model. Let me explain the situation in more detail:
I define my model using a pre-trained ResNet-50 from which I removed the last two blocks (the error also occurs if I use the complete model). After the ResNet blocks, instead of the fully connected layer, I use a deconvolutional (transposed convolution) layer, because my task is a regression problem.
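Roughly, the model looks like this (a simplified sketch; the exact channel sizes and the layers I cut are only indicative here):

    import torch.nn as nn
    from torchvision import models

    class ResNetDeconv(nn.Module):
        def __init__(self):
            super().__init__()
            resnet = models.resnet50(pretrained=True)
            # drop the last two ResNet blocks plus avgpool and fc
            self.resnet_encoder = nn.Sequential(*list(resnet.children())[:-4])
            # transposed convolution instead of the fully connected layer
            self.decoder = nn.ConvTranspose2d(512, 1, kernel_size=4, stride=2, padding=1)

        def forward(self, x):
            result = self.resnet_encoder(x)
            result = self.decoder(result)
            return result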
After the deconvolutional layer I perform some numpy operations to extract the prediction of my model. The workflow is this:
    result = self.resnet_encoder(x)   # the ResNet layers
    result = self.decoder(result)     # the convolutional-transpose layer
    output = ...                      # a series of basic numpy operations here
Now, if I define my loss function (an MSE) on the result tensor, everything works fine: the model is able to run the backpropagation pass. But if I instead define the loss on the output tensor, I get the error above, even though the loss value itself is computed correctly.
The loss is defined like this:
    loss_fun = nn.MSELoss()

    # result = my model's result prediction
    # result_dataset = can be seen as a ground-truth expectation of the result
    decoder_loss = loss_fun(result, torch.from_numpy(result_dataset))

    # output = my model's output prediction
    # output_dataset = ground-truth output expectation
    output_loss = loss_fun(torch.from_numpy(output), torch.from_numpy(output_dataset))
The error appears when I try to do:
    output_loss.backward()
    # decoder_loss.backward()  # => this instead works correctly
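For what it's worth, I can reproduce the same behaviour outside my pipeline with this standalone snippet (just a simplified repro, not my actual code):

    import torch
    import torch.nn as nn

    loss_fun = nn.MSELoss()

    x = torch.randn(4, 3, requires_grad=True)
    result = x * 2                           # still attached to the autograd graph
    output = result.detach().numpy() + 1.0   # NumPy step outside the graph

    # works: result has a grad_fn
    loss_fun(result, torch.zeros(4, 3)).backward()

    # raises the RuntimeError: torch.from_numpy(output) has no grad_fn
    loss_fun(torch.from_numpy(output), torch.zeros(4, 3)).backward()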
The thing is that my labels are based on the output tensor, so I need to find a way to compute the loss and run the backward pass based on that value.
Thank you so much.