Question on implementing Excitation Backpropagation in PyTorch

Hi, I’m trying to implement the following paper in pytorch.
https://arxiv.org/abs/1608.00507

It’s one way to construct a saliency map.
For a given Conv2d layer, we need to truncate the weight W to W+, in which only the positive weights are retained. Then a backward Conv2d pass is performed with this new weight (together with the normalized output). However, in PyTorch the operator torch._C._functions.Conv2d is not allowed to be used directly. Is there any suggestion on how to overcome this and implement the paper?
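
For reference, the truncation step on its own seems straightforward; something like this (a minimal sketch, the layer shape is just a placeholder):

```python
import torch.nn as nn

# placeholder conv layer standing in for a layer of the network being explained
layer = nn.Conv2d(64, 128, kernel_size=3, padding=1)

# W+ : keep only the positive weights, zero out the negative ones
w_plus = layer.weight.data.clamp(min=0)
```

What I can’t figure out is how to run the backward convolution with w_plus without calling the internal operator.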

Thanks a lot and happy new year !


Seems like guided backpropagation.


To handle the inconsistent forward and backward functions in excitation backprop, I implemented the two passes with plain conv functions, roughly as sketched below.
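
Here is a sketch of the core computation (not the exact code from the repo linked below; it just uses `torch.nn.functional` convolutions for clarity):

```python
import torch
import torch.nn.functional as F

def eb_conv2d(bottom_activation, weight, top_prob, stride=1, padding=0, eps=1e-10):
    """Excitation-backprop style probability redistribution for one conv layer."""
    # keep only the positive part of the weights (W+)
    w_plus = weight.clamp(min=0)
    # forward pass with the positive weights only
    z = F.conv2d(bottom_activation, w_plus, bias=None, stride=stride, padding=padding)
    # normalize the top probabilities by the positive responses
    s = top_prob / (z + eps)
    # "backward" conv: transposed convolution with the same positive weights
    c = F.conv_transpose2d(s, w_plus, bias=None, stride=stride, padding=padding)
    # scale by the bottom activations to get the bottom probabilities
    return bottom_activation * c
```

The normal forward pass of the network keeps the original weights; only the redistribution step above uses W+, which is where the forward and backward become inconsistent.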

Note that this is a temporary solution for the PyTorch 0.2 version. The full implementation can be found at https://github.com/yulongwang12/visual-attribution. Hope it helps!