I am trying to get intermediate layer values from a pretrained network, say VGG19. The main issue is accessing pre-activation values, i.e., in the case of ReLU, I need both the positive and negative values just before the activation is applied.
- I am able to attach forward hooks and then do a single forward pass (after attaching all forward hooks to the required layers), but that gives me only positive values, except for the last layer, which has no ReLU after it. I suspect this is because of the in-place ReLU operation, but I am not sure.
- In another implementation, if I do a forward pass after every `register_forward_hook` call, I am able to get the negative values as well, but then I cannot get gradients with respect to the image (I need those too). I suspect this is due to the multiple forward passes, but would appreciate clarification if I am wrong.
Any help would be appreciated. Do let me know if more code is needed to clear up any ambiguity in the question. Thanks.