I used PyTorch tensors for all the operations in forward, which means backward will be computed automatically by autograd.
However, I want to change a parameter in my layer over time, and I need to be able to do that after backward has completed. I shouldn't have to write a custom backward, since autograd already calculates the gradients for me.
How can I use autograd and, at the same time, have my custom layer tweak its parameters over time during training?
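One common pattern for this (a sketch, not from the original post): keep forward built purely from tensor ops so autograd handles backward, then mutate the parameter in-place inside a `torch.no_grad()` block after the optimizer step. The layer name `ScaledLinear`, its `scale` parameter, and the clamping schedule below are all hypothetical examples chosen for illustration.

```python
import torch
import torch.nn as nn

class ScaledLinear(nn.Module):
    """Hypothetical layer: a linear map with an extra learnable scale."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.scale = nn.Parameter(torch.ones(1))

    def forward(self, x):
        # Built entirely from differentiable tensor ops,
        # so autograd derives backward automatically.
        return self.scale * self.linear(x)

model = ScaledLinear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 4)
target = torch.randn(8, 2)

loss = nn.functional.mse_loss(model(x), target)
loss.backward()   # autograd computes all gradients
opt.step()        # gradient-based update
opt.zero_grad()

# Post-step tweak: modify the parameter in-place, outside autograd,
# so the change is not recorded in the computation graph.
with torch.no_grad():
    model.scale.clamp_(min=0.5)  # example constraint/schedule, chosen arbitrarily
```

The key point is the `torch.no_grad()` context: in-place edits to a leaf tensor that requires grad raise an error under autograd tracking, but inside `no_grad()` they are allowed, and the parameter continues to receive gradients normally on the next backward pass.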