I want to make a loss function that depends on the label (some dimensions are correlated). For example, I would like to weight the loss on feature 2 by the norm of feature 1. The issue is that autograd says it cannot compute the gradient:
```
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [20, 224, 224]] is at version 2; expected version 1 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
```
for the code

```python
outputs[:, 1] *= labels[:, 0]
labels[:, 1] *= labels[:, 0]
loss_parameters = F.smooth_l1_loss(outputs, labels)
```
Do you have any idea how to make this work?
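For reference, here is a minimal sketch of one way the intended weighting could be expressed without in-place mutation (variable names and shapes are illustrative, not from the original code): instead of writing into `outputs` and `labels` in place, build a weight tensor and multiply out-of-place, so the tensors autograd saved for the backward pass are never modified.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
outputs = torch.randn(4, 3, requires_grad=True)  # hypothetical model output
labels = torch.randn(4, 3)                       # hypothetical targets

# Build a per-column weight instead of mutating outputs/labels in place.
# labels carries no gradient, so filling w in place is safe.
w = torch.ones_like(labels)
w[:, 1] = labels[:, 0]  # weight feature 2 by feature 1 of the label

# Out-of-place multiplication: outputs stays at version 0 for autograd.
loss = F.smooth_l1_loss(outputs * w, labels * w)
loss.backward()  # no "modified by an inplace operation" error
```

Alternatively, `outputs.clone()` before the in-place edits would also avoid the error, at the cost of an extra copy.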