I would like to replace the autograd mechanism with my own custom gradient calculations. Is there an easy way to do that in PyTorch? I am looking for some examples where the gradient calculations can be supplied by another mechanism. My DNN is very simple: fully connected with ReLU, and I want to run Adam with my own gradient mechanism.
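For concreteness, here is a minimal sketch of the kind of thing I mean: compute the gradients yourself (hand-written backprop for a 2-layer ReLU MLP with MSE loss here; `manual_grads` is just an illustrative name), write them into each parameter's `.grad`, and let `torch.optim.Adam` consume them. Autograd never builds a graph because the parameters have `requires_grad=False`.

```python
import torch

def manual_grads(W1, b1, W2, b2, x, y):
    """Forward pass plus hand-derived backward for a 2-layer ReLU MLP
    with MSE loss. Returns (loss, [dW1, db1, dW2, db2]) computed without
    relying on autograd's backward."""
    z1 = x @ W1 + b1
    a1 = torch.relu(z1)
    yhat = a1 @ W2 + b2
    loss = ((yhat - y) ** 2).mean()

    # Manual backprop (replace these lines with your own gradient mechanism).
    dyhat = 2.0 * (yhat - y) / yhat.numel()   # d(mean squared error)/d(yhat)
    dW2 = a1.t() @ dyhat
    db2 = dyhat.sum(0)
    da1 = dyhat @ W2.t()
    dz1 = da1 * (z1 > 0).to(da1.dtype)        # ReLU derivative
    dW1 = x.t() @ dz1
    db1 = dz1.sum(0)
    return loss, [dW1, db1, dW2, db2]

torch.manual_seed(0)
# Plain tensors: requires_grad defaults to False, so autograd is bypassed.
W1 = torch.randn(4, 8) * 0.5
b1 = torch.zeros(8)
W2 = torch.randn(8, 1) * 0.5
b2 = torch.zeros(1)
params = [W1, b1, W2, b2]

opt = torch.optim.Adam(params, lr=1e-2)
x = torch.randn(32, 4)
y = torch.randn(32, 1)

for _ in range(100):
    loss, grads = manual_grads(W1, b1, W2, b2, x, y)
    for p, g in zip(params, grads):
        p.grad = g        # hand the optimizer our own gradients
    opt.step()
    opt.zero_grad()
```

The key point is that `optimizer.step()` only reads each parameter's `.grad` field; it does not care where those gradients came from, so any mechanism that fills `.grad` works.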
Thanks, I had considered that. I assume there is no drop-in replacement for autograd, since PyTorch and autograd are tightly integrated on the Python end. Is there a way to do this on the C++ side of things?
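For reference, here is a minimal sketch of the standard Python-side mechanism for overriding gradients per operation, `torch.autograd.Function`, which I assume is what was suggested (a custom ReLU with a hand-written backward; `MyReLU` is just an illustrative name). The same class exists on the C++ side as `torch::autograd::Function` in libtorch.

```python
import torch

class MyReLU(torch.autograd.Function):
    """ReLU whose backward is supplied by hand instead of autograd's rule."""

    @staticmethod
    def forward(ctx, inp):
        ctx.save_for_backward(inp)
        return inp.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_out):
        # Custom gradient: pass grad_out through where the input was positive.
        (inp,) = ctx.saved_tensors
        return grad_out * (inp > 0).to(grad_out.dtype)

x = torch.randn(5, requires_grad=True)
y = MyReLU.apply(x).sum()
y.backward()   # calls MyReLU.backward, not autograd's built-in ReLU rule
```

This overrides the gradient of a single op rather than replacing autograd wholesale, which is why it may not be what you want for a fully external gradient mechanism.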