Hi,
I am currently working on a gradient-pruning problem.
For convolutions, during backpropagation, I need to alter the weight gradient (grad_weight) computation by slicing the output gradient (grad_output) before the actual gradient computation is done.
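For concreteness, here is roughly the backward behaviour I am after, sketched as a custom autograd.Function on top of the slow torch.nn.grad reference functions (prune_grad_output is just a placeholder for my slicing step, and shapes would need care if the slicing actually dropped entries):

```python
import torch
import torch.nn.functional as F
import torch.nn.grad  # reference (slow) implementations of the conv gradients


def prune_grad_output(grad_output):
    # Placeholder for the slicing/pruning step; the real selection criterion
    # is problem-specific. Here it just returns grad_output unchanged.
    return grad_output


class PrunedConv2d(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, weight, stride=1, padding=0, dilation=1, groups=1):
        ctx.save_for_backward(input, weight)
        ctx.conf = (stride, padding, dilation, groups)
        return F.conv2d(input, weight, None, stride, padding, dilation, groups)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        stride, padding, dilation, groups = ctx.conf
        # grad_input is computed from the full grad_output ...
        grad_input = torch.nn.grad.conv2d_input(
            input.shape, weight, grad_output, stride, padding, dilation, groups)
        # ... while grad_weight is computed from the sliced/pruned grad_output.
        grad_weight = torch.nn.grad.conv2d_weight(
            input, weight.shape, prune_grad_output(grad_output),
            stride, padding, dilation, groups)
        return grad_input, grad_weight, None, None, None, None
```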
So to achieve this goal, I need access to the functions that perform the backpropagation computation for convolutions.
I don’t want to use the functions conv2d_input and conv2d_weight from torch.nn.grad because they are slow. Binding the function convolution_backward does not solve my problem either, since I also compute the gradient of the input (grad_input) without slicing grad_output.
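To make that second point concrete, this is roughly what the convolution_backward route looks like (here through torch.ops.aten.convolution_backward, assuming I read the signature in native_functions.yaml correctly; the argument values are only illustrative). All returned gradients are derived from the single grad_output argument, so I cannot feed grad_weight a sliced grad_output and grad_input the full one in a single call:

```python
import torch

x = torch.randn(8, 3, 32, 32)
w = torch.randn(16, 3, 3, 3)
y = torch.nn.functional.conv2d(x, w, padding=1)
go = torch.randn_like(y)

# One grad_output in, (grad_input, grad_weight, grad_bias) out; output_mask
# only selects which of the three are computed, they all share the same go.
grad_input, grad_weight, grad_bias = torch.ops.aten.convolution_backward(
    go, x, w,
    None,                     # bias_sizes (no bias here)
    [1, 1], [1, 1], [1, 1],   # stride, padding, dilation
    False, [0, 0], 1,         # transposed, output_padding, groups
    [True, True, False],      # output_mask: grad_input, grad_weight, grad_bias
)
```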
I want to use the CUDA/cuDNN functions to do the job.
My best guess so far was this example on GitHub:
This example uses cudnn_convolution_backward_weight and cudnn_convolution_backward_input to compute the corresponding gradients.
But, from what I understand, these functions have been removed from the ATen API (for code rule compliance) and are no longer accessible (they no longer appear in native_functions.yaml).
My point is that I really need fine control over the grad_input and grad_weight (and grad_bias) computations. So how can I do that?
More generally, how can I make functions that are not listed inside native_functions.yaml accessible for Python/torch binding?
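For reference, by "binding" I mean a small C++ extension along these lines; the sketch below wraps at::convolution_backward only because its declaration is public, and the part I am missing is what to include and call for kernels that are no longer listed in native_functions.yaml:

```python
import torch
from torch.utils.cpp_extension import load_inline

# Sketch of the binding mechanism via an inline C++ extension. The wrapped
# function here is at::convolution_backward, which *is* public; the open
# question is how to reach the removed cuDNN-specific backward kernels.
cpp_source = r"""
#include <torch/extension.h>

std::tuple<at::Tensor, at::Tensor, at::Tensor> conv_backward(
    const at::Tensor& grad_output,
    const at::Tensor& input,
    const at::Tensor& weight,
    std::vector<int64_t> stride,
    std::vector<int64_t> padding,
    std::vector<int64_t> dilation,
    int64_t groups,
    std::array<bool, 3> output_mask) {
  return at::convolution_backward(
      grad_output, input, weight,
      /*bias_sizes=*/c10::nullopt,
      stride, padding, dilation,
      /*transposed=*/false, /*output_padding=*/{0, 0},
      groups, output_mask);
}
"""

ext = load_inline(
    name="conv_backward_ext",
    cpp_sources=cpp_source,
    functions=["conv_backward"],  # auto-generates the pybind11 module
)
```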
Sorry for the long post.
Thanks in advance for your answers and guidance.