Hi, I want to create a custom layer whose backward pass is not the gradient of its forward pass, but is instead the PyTorch autograd of a different forward pass. In addition, I want the layer to have saved variables. I'm wondering if there is an example of a good way to do this.
The only way I can think of to do this is to make an nn.Module subclass that stores the saved variables, have it call a subclass of autograd.Function, and have that autograd function compute its backward pass by running a forward pass of another nn.Module and then calling autograd on it, which doesn't seem very elegant. Roughly something like the sketch below.
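Just to illustrate what I mean (all the names here are made up, and I'm not sure this is the idiomatic way): the forward pass returns sign(x), but the backward pass follows the gradient of tanh(scale * x), where scale is a saved variable held by the wrapping module.

```python
import torch
import torch.nn as nn


class SurrogateGradFn(torch.autograd.Function):
    """Forward uses f(x); backward uses autograd of a different function g(x)."""

    @staticmethod
    def forward(ctx, x, g, state):
        ctx.save_for_backward(x, state)
        ctx.g = g                      # the surrogate forward whose gradient we want
        return torch.sign(x)           # the "real" forward f(x)

    @staticmethod
    def backward(ctx, grad_output):
        x, state = ctx.saved_tensors
        with torch.enable_grad():
            x_ = x.detach().requires_grad_(True)
            y = ctx.g(x_, state)                       # different forward pass g(x)
            (grad_x,) = torch.autograd.grad(y, x_, grad_output)
        return grad_x, None, None      # no grads for g or state


class SurrogateLayer(nn.Module):
    """nn.Module wrapper that owns the saved variables."""

    def __init__(self):
        super().__init__()
        self.register_buffer("scale", torch.tensor(2.0))   # example saved variable

    def forward(self, x):
        # backward will follow tanh(scale * x) instead of sign(x)
        g = lambda x_, state: torch.tanh(state * x_)
        return SurrogateGradFn.apply(x, g, self.scale)


layer = SurrogateLayer()
x = torch.randn(4, requires_grad=True)
layer(x).sum().backward()
print(x.grad)   # gradient of sum(tanh(2 * x)), not of sum(sign(x))
```

So it works, but it needs the extra module/function indirection plus re-running a forward pass inside backward, which is why I'm asking whether there's a cleaner pattern.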