Can I write my own backward function for gradients?
More precisely, I just want to change a small part of PyTorch's original backward for a certain forward algorithm. Where can I see the code for that forward algorithm's backpropagation steps?
Yes, you can create custom autograd.Functions as described here, and you can find the backward definitions in tools/autograd/derivatives.yaml or by searching for the kernel name in the repository.
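For the first part, here is a minimal sketch of a custom autograd.Function, using exp as a stand-in for whatever op you want to modify; the backward staticmethod is where you would change the math:

```python
import torch

class MyExp(torch.autograd.Function):
    # Toy custom op: forward computes exp(x), backward is hand-written.

    @staticmethod
    def forward(ctx, x):
        result = torch.exp(x)
        ctx.save_for_backward(result)  # cache tensors the backward needs
        return result

    @staticmethod
    def backward(ctx, grad_output):
        (result,) = ctx.saved_tensors
        # d/dx exp(x) = exp(x); edit this line to change the gradient math
        return grad_output * result

x = torch.randn(3, dtype=torch.double, requires_grad=True)
y = MyExp.apply(x)
y.sum().backward()
print(x.grad)

# gradcheck compares the analytical backward against a numerical estimate
# (it expects double-precision inputs):
print(torch.autograd.gradcheck(MyExp.apply, (x,)))
```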
Thanks for your reply! I checked the derivatives.yaml file and found the name of the function I want to rewrite, but I still cannot find the actual math. Where do the concrete calculations of the backpropagation steps live?
The derivative should either point to a few native ops or to the actual backward kernel, which you could then search for.
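For example (assuming you were looking at tanh), its entry in derivatives.yaml points to the tanh_backward kernel, which you can call directly through the aten namespace and compare against what autograd computes:

```python
import torch

# derivatives.yaml maps tanh's gradient to the tanh_backward kernel:
#   tanh_backward(grad, out) = grad * (1 - out^2)
x = torch.randn(4, requires_grad=True)
y = torch.tanh(x)
grad_out = torch.ones_like(y)

manual = torch.ops.aten.tanh_backward(grad_out, y)  # call the kernel directly
(auto,) = torch.autograd.grad(y, x, grad_out)       # what autograd produces
print(torch.allclose(manual, auto))                 # True
```

The kernel implementations themselves live under aten/src/ATen/native/ in the repository, so searching there for the kernel name should show the concrete math.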