Hi. While training ResNet-20 on the CIFAR-10 dataset, I want to modify the activation values during backpropagation, but I get the following error.
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [128, 16, 32, 32]], which is output 0 of ReluBackward1, is at version 2; expected version 1 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
Can’t I change the activation value that was saved during the forward pass while running the backward pass?
You can change it. But if you change it in place and something else needs the original value (the ReLU in this case), then the ReLU can no longer compute its backward. Hence the error.
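Here is a minimal sketch of that failure mode (hypothetical standalone tensors, not your ResNet-20 code): ReLU saves its output for the backward pass, so an in-place change to that output bumps its version counter and invalidates the saved value. Modifying a clone instead leaves the saved tensor untouched.

```python
import torch

# In-place edit of a tensor that ReLU saved for backward -> RuntimeError.
x = torch.randn(4, requires_grad=True)
y = torch.relu(x)
y.add_(1.0)  # bumps y's version; ReluBackward still expects the old version
raised = False
try:
    y.sum().backward()
except RuntimeError:
    raised = True  # "modified by an inplace operation" error
print("backward failed:", raised)

# Fix: modify a clone, so the tensor saved by ReLU keeps its version.
x2 = torch.randn(4, requires_grad=True)
y2 = torch.relu(x2)
z = y2.clone().add_(1.0)  # the clone has its own version counter
z.sum().backward()        # succeeds; gradients flow back through the clone
print(x2.grad)
```

The clone costs one extra copy of the activation, but it keeps autograd's saved tensors consistent.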
I want to see the code that calculates the gradient of an existing built-in function (for example, torch.nn.Conv2d). If that’s possible, maybe I can change the activation value to whatever I want when calculating the gradient.
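The backward of built-ins like torch.nn.Conv2d is implemented in C++, so it is not easy to read or edit from Python. A common alternative is to take control of the backward yourself with a custom torch.autograd.Function. A sketch (the class name MyReLU is made up; this example just reproduces the standard ReLU gradient, but inside backward you are free to use any activation value you like):

```python
import torch

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, inp):
        # Save whatever you need for the backward computation.
        ctx.save_for_backward(inp)
        return inp.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        (inp,) = ctx.saved_tensors
        # You decide here what gradient is returned; this version is the
        # usual ReLU derivative (zero where the input was negative).
        grad_input = grad_output.clone()
        grad_input[inp < 0] = 0
        return grad_input

x = torch.randn(8, requires_grad=True)
y = MyReLU.apply(x)   # use .apply(), not MyReLU(x)
y.sum().backward()
print(x.grad)
```

Since you own the `backward` staticmethod, you can substitute a modified activation value there without touching the tensors that other ops saved, which avoids the in-place version error entirely.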