Can't access intermediate pre-activation values in a network

Hi all,
I am trying to get intermediate layer values from a pretrained network, say VGG19. The main issue is accessing the pre-activation values, i.e., in the case of ReLU I need both the positive and negative values just before the activation is applied.

  1. I am able to attach forward hooks to the required layers and then do a single forward pass, but that gives me only positive values (I suspect this is because of the in-place ReLU operation, but I am not sure), except for the last layer, which is not followed by a ReLU.
  2. In another implementation, if I do a forward pass after every register_forward_hook call, I am able to get the negative values as well, but I can no longer get gradients with respect to the image (which I also need). I suspect this is due to the multiple forward passes, but I would appreciate clarification if that is wrong.

Any help would be appreciated. A rough sketch of the hook setup from point 1 is below; do let me know if more code is needed to clarify the question. Thanks.
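This is roughly the setup from point 1 (the layer indices are just an example, not the full set I use):

```python
import torch
import torchvision.models as models

# load a pretrained VGG19
model = models.vgg19(pretrained=True).eval()

# store what should be the pre-activation values, i.e. the conv outputs
pre_acts = {}

def make_hook(name):
    def hook(module, inp, out):
        # the conv output is the input to the following ReLU
        pre_acts[name] = out
    return hook

# attach forward hooks to a few conv layers of the feature extractor
for idx in [0, 5, 10, 19, 28]:
    model.features[idx].register_forward_hook(make_hook(f"conv_{idx}"))

x = torch.randn(1, 3, 224, 224)
out = model(x)

# the stored tensors end up all non-negative, presumably because the
# in-place ReLU following each conv overwrites them
print({k: v.min().item() for k, v in pre_acts.items()})
```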

  1. You are most likely right: the in-place ReLU manipulates the output of the preceding layer, so the tensor stored in your hook is overwritten. You could copy-paste the torchvision VGG implementation and change the lines that create nn.ReLU(inplace=True) to use inplace=False, or alternatively replace the modules manually in the loaded model, as sketched below.

  2. I’m not sure I understand the question correctly. The gradients would be calculated for the parameters by default (not the activations). Which gradients would you like to calculate?
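Regarding point 1, swapping the modules in the loaded model could look like this (a minimal sketch; it assumes the standard torchvision VGG19 layout where all ReLUs live in `model.features`):

```python
import torch.nn as nn
import torchvision.models as models

model = models.vgg19(pretrained=True).eval()

# swap every in-place ReLU in the feature extractor for a non-in-place one,
# so the conv outputs captured by the forward hooks are no longer overwritten
for idx, module in enumerate(model.features):
    if isinstance(module, nn.ReLU):
        model.features[idx] = nn.ReLU(inplace=False)
```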
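If you mean the gradient with respect to the input image, setting requires_grad=True on the input before the forward pass should populate it; a minimal sketch (the input shape and the chosen scalar are just placeholders):

```python
import torch
import torchvision.models as models

model = models.vgg19(pretrained=True).eval()

# make the input a leaf tensor that requires gradients
x = torch.randn(1, 3, 224, 224, requires_grad=True)

out = model(x)
out[0, 0].backward()   # backprop some scalar, e.g. a single logit

print(x.grad.shape)    # gradient w.r.t. the image: torch.Size([1, 3, 224, 224])
```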