Replacing autograd with a different gradient mechanism

I would like to replace the autograd mechanism with my own custom gradient calculations. Is there an easy way to do that in PyTorch? I am looking for examples where the gradient calculations can be supplied by another mechanism. My DNN is very simple (fully connected with ReLU), and I want to run Adam with my own gradient mechanism.
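Concretely, what I have in mind is something like the sketch below, where `compute_my_gradients` is a hypothetical stand-in for my own gradient mechanism (all names and shapes here are just for illustration):

```python
import torch
import torch.nn as nn

# The network from the question: fully connected with ReLU.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def compute_my_gradients(model, inputs, targets):
    """Hypothetical placeholder for my own gradient mechanism.

    Returns one tensor per parameter, with the same shape as that parameter."""
    return [torch.zeros_like(p) for p in model.parameters()]

inputs = torch.randn(8, 10)
targets = torch.randn(8, 1)

for step in range(10):
    grads = compute_my_gradients(model, inputs, targets)
    # Write the externally computed gradients into .grad and let Adam step.
    for p, g in zip(model.parameters(), grads):
        p.grad = g
    optimizer.step()
    optimizer.zero_grad()
```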

Thank you.

Hi, for custom gradient calculations you can define a new layer by subclassing Function.
Here is an example: PyTorch: Defining New Autograd Functions.
You can find more detail about Function here.
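A minimal sketch along the lines of that tutorial (the `MyReLU` name is just for illustration):

```python
import torch

class MyReLU(torch.autograd.Function):
    """ReLU whose backward pass is written by hand instead of derived by autograd."""

    @staticmethod
    def forward(ctx, input):
        # Save what the backward pass will need.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # This is where your own gradient calculation goes.
        (input,) = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0
        return grad_input
```

You would then call `MyReLU.apply(x)` inside your model; Adam consumes the gradients this `backward` produces just like any other `.grad`.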

Thanks, I had considered that. I assume there is no drop-in replacement for autograd, since PyTorch and autograd are tightly integrated on the Python end. Is there a way to do this on the C++ side of things?

Of course. You can write a custom op in C++ and then build it into a shared library.
Here is an example: Using the TorchScript Custom Operator in C++.
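Once the C++ op is built into a shared library following that tutorial, the Python side is a short sketch like the one below. The library path and the op name (`libmy_ops.so`, `my_ops::my_relu`) are placeholders for whatever you actually build and register:

```python
import torch

# Load the compiled shared library so its registered ops become available
# under torch.ops.<namespace>.<op_name>.
torch.ops.load_library("build/libmy_ops.so")

x = torch.randn(4, 4)
y = torch.ops.my_ops.my_relu(x)
```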