Can I call backward() without a new forward() after an optimizer.step()?

I have a simple ResNet. During training, I call forward to get the output and compute the loss, then call backward on the loss to get the gradients of the weights, and call optimizer.step() to train the net. Up to here, everything is OK. But then I want to call output.backward() again to get the gradients of intermediate results of the ResNet, such as the output feature map of the last CNN layer (this is used in Grad-CAM). Here I hit the problem: "RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation". I guess this is because the weights of each layer have been modified by the optimizer, which is an in-place operation, so another backward will fail.
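A minimal sketch of the failing sequence, with a hypothetical tiny net standing in for the ResNet (note the first backward already needs retain_graph=True so that a second backward is possible at all):

```python
import torch
import torch.nn as nn

# Hypothetical tiny net standing in for the ResNet.
model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(4, 10)
output = model(x)
loss = output.sum()

loss.backward(retain_graph=True)  # first backward: weight gradients
optimizer.step()                  # updates the weights in place

# The in-place update bumped the version counters of the weight tensors
# that autograd saved for backward, so a second backward over the same
# graph fails with:
# RuntimeError: one of the variables needed for gradient computation
# has been modified by an inplace operation
output.sum().backward()
```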

I know that a simple solution is to do the forward pass again, but that would slow down training. Is there a better solution to this problem, such as a way to store the gradients of the two backward passes separately?

Hi,

Another solution is to compute these intermediate gradients before doing the optimizer step, no?
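That would look something like the sketch below. It is only an illustration, assuming a hypothetical backbone/classifier split of the ResNet so the last feature map is an explicit tensor, and using `output[:, 0]` as a stand-in Grad-CAM target score:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical split into a conv backbone and a classifier head.
backbone = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU())
classifier = nn.Sequential(nn.Flatten(), nn.Linear(8 * 32 * 32, 10))
params = list(backbone.parameters()) + list(classifier.parameters())
optimizer = torch.optim.SGD(params, lr=0.1)

x = torch.randn(2, 3, 32, 32)
target = torch.randint(0, 10, (2,))

features = backbone(x)               # feature map wanted for Grad-CAM
output = classifier(features)
loss = F.cross_entropy(output, target)

# First backward: weight gradients, keeping the graph alive.
loss.backward(retain_graph=True)

# Second backward *before* the step, while the graph still matches the
# weights. torch.autograd.grad returns the gradients instead of
# accumulating them into .grad, so they stay separate from the weight
# gradients above.
cam_grads = torch.autograd.grad(output[:, 0].sum(), features)[0]

optimizer.step()  # safe now: no further backward over this graph
optimizer.zero_grad()
```

Since torch.autograd.grad does not write into .grad, this also keeps the gradients of the two backward passes stored separately, which answers the second part of the question.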