I am trying to change the backward computation graph for a custom model in which I use a regular Python function to modify the model's weights before performing the forward and backward passes.
Specifically, I want to change how the gradient of this weight-modifying function is computed. I am familiar with backward hooks and tried using them, but that did not work (I printed the hooked model's graph with torchviz to confirm).
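To make the setup concrete, here is a minimal sketch of the kind of customization I am after, written with `torch.autograd.Function` (the transformation and gradient here are placeholders, not my actual model):

```python
import torch

class ScaleWeight(torch.autograd.Function):
    """Placeholder weight transformation with a custom backward."""

    @staticmethod
    def forward(ctx, weight):
        ctx.save_for_backward(weight)
        # Transform the weight before the forward pass (placeholder: scale by 2)
        return weight * 2.0

    @staticmethod
    def backward(ctx, grad_output):
        (weight,) = ctx.saved_tensors
        # Custom gradient instead of the default (which would be grad_output * 2.0)
        return grad_output * 0.5

w = torch.randn(3, requires_grad=True)
y = ScaleWeight.apply(w).sum()
y.backward()
print(w.grad)  # gradient produced by the custom backward, i.e. all 0.5
```

This works for a single tensor, but I am unsure how to apply this idea cleanly when the function rewrites the weights of an entire `nn.Module` before each forward pass.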
I would like to know whether there are other ways to change the backward computation graph for a model, and ideally a recommended way to achieve this goal.
Thank you in advance for any help!