I have a Function whose only job is to negate its gradient. After upgrading to 0.2.0, I realized that its backward function is no longer being called. Has there been a change in the behaviour of the computational graph?
If I change return input to return input+1e-10, the backward function is correctly called again. If this is some newly introduced graph optimization, can someone please point me to the documentation?
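The Function is essentially this (simplified sketch, using the pre-0.4 Variable API):

```python
import torch
from torch.autograd import Function, Variable

class NegateGradient(Function):
    def forward(self, input):
        return input  # pass the value through untouched

    def backward(self, grad_output):
        return grad_output.neg()  # flip the sign of the gradient

b = Variable(torch.randn(3), requires_grad=True)
neg_grad_b = NegateGradient()(b)
neg_grad_b.sum().backward()  # on 0.2.0, backward() above is never reached
```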
The problem comes from the fact that the output of your forward pass shares the same storage as the input. In this case, you should mark the input as “dirty” with the mark_dirty() function: add self.mark_dirty(input) in your forward pass.
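Applied to a minimal version of your Function, the old-style fix would look like this:

```python
class NegateGradient(Function):
    def forward(self, input):
        self.mark_dirty(input)  # declare that the output shares storage with the input
        return input

    def backward(self, grad_output):
        return grad_output.neg()
```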
You can also change this to use the new style for Function:
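Something along these lines (a sketch; in the new-style API, a ctx object replaces self):

```python
class NegateGradient(Function):
    @staticmethod
    def forward(ctx, input):
        ctx.mark_dirty(input)  # still needed: the output shares storage with the input
        return input

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output.neg()
```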
Be careful though: the new-style functions (the ones that use static methods) are called slightly differently. You should use the .apply() method instead of just calling an instance with the arguments of your forward pass.
In your particular case you should change neg_grad_b = NegateGradient()(b) to neg_grad_b = NegateGradient.apply(b).
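For example, with the new-style sketch above (mark_dirty() counts as an in-place modification, which autograd typically disallows on a leaf Variable that requires grad, so the input here is a non-leaf):

```python
x = Variable(torch.randn(3), requires_grad=True)
b = x * 2                             # non-leaf input
neg_grad_b = NegateGradient.apply(b)  # no parentheses after the class name
neg_grad_b.sum().backward()
print(x.grad)  # all -2: d(sum)/dx would be 2, negated by NegateGradient
```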
What would happen if one does that? Should it break, or would the behaviour differ from the one we expect? I ask because I use 0.2 and have multiple calls like this in my code.
That means that somewhere, when you use this function, you perform in-place operations on either the input or the output. And since you use the mark_dirty() function, the autograd engine is now able to detect this problem and raise an error.
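As an illustration (a hypothetical sketch, not your actual code), the following pattern is the kind of thing that now raises, assuming the mark_dirty() version of the Function above:

```python
x = Variable(torch.randn(3), requires_grad=True)
h = x.exp()                    # exp() saves its output h to compute its own backward
out = NegateGradient.apply(h)  # marks h dirty: "h has been modified in place"
out.sum().backward()           # RuntimeError: a variable needed for gradient
                               # computation has been modified by an inplace operation
```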