How do I get the gradient w.r.t. the model parameters by using hooks?

It seems that callbacks registered with register_backward_hook() only receive gradients w.r.t. the module’s output (grad_output) and w.r.t. the inputs of the module’s last internal operation rather than the module itself (a known quirk of that hook), as detailed here.

Now, what one would expect to get during the backward pass is the gradient with respect to the module’s parameters, e.g. weight and bias, but no such thing is passed to the hook.

The gradients are stored in the grad property of the module parameters after loss.backward() has been called, but I’d like to get them via hooks somehow, as I do for all the other information.
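For completeness, this is the access pattern I mean (a minimal sketch with a toy linear layer):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
loss = model(torch.randn(3, 4)).sum()
loss.backward()

# After backward(), the gradients live on the parameters themselves:
print(model.weight.grad)  # same shape as model.weight, i.e. (2, 4)
print(model.bias.grad)    # shape (2,)
```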

How can this be done?

It turns out that instead of calling register_backward_hook() on modules, one can check whether they have weight and/or bias attributes and call register_hook() on those parameter tensors directly. The callback then receives the gradient with respect to that tensor.
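A minimal sketch of that approach (the model and the printing are just placeholders):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

def make_param_hook(name):
    def hook(grad):
        # grad is d(loss)/d(parameter), with the same shape as the parameter
        print(f"{name}: grad norm = {grad.norm():.4f}")
    return hook

# Attach a hook to every weight/bias parameter found on the modules
for mod_name, module in model.named_modules():
    for attr in ("weight", "bias"):
        param = getattr(module, attr, None)
        if isinstance(param, nn.Parameter):
            param.register_hook(make_param_hook(f"{mod_name}.{attr}"))

loss = model(torch.randn(3, 4)).sum()
loss.backward()  # the hooks fire here, once per parameter
```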

How do I print the gradients of the loss w.r.t. intermediate layers in a neural network?

Just read the initial post; use register_backward_hook()
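A minimal sketch of that suggestion; note that on recent PyTorch versions register_full_backward_hook() is the non-deprecated variant of register_backward_hook(), with the same (module, grad_input, grad_output) signature:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

def print_grad(module, grad_input, grad_output):
    # grad_output[0] is d(loss)/d(module output)
    print(f"{module.__class__.__name__}: "
          f"grad_output norm = {grad_output[0].norm():.4f}")

for layer in model:
    layer.register_full_backward_hook(print_grad)

loss = model(torch.randn(3, 4)).sum()
loss.backward()
```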

I am getting an error saying “Expected tuple but got Tensor”.

ERROR: "expected tuple, but hook returned ‘Tensor’"
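That error usually means the hook is returning a bare Tensor. If a module backward hook returns anything at all, it must be either None or a tuple with one entry per element of grad_input. A sketch of the broken pattern and two ways to fix it:

```python
# Wrong: returning a bare Tensor from a module backward hook raises
# "expected tuple, but hook returned 'Tensor'"
def bad_hook(module, grad_input, grad_output):
    return grad_output[0]

# Option 1: return nothing and just observe the gradients
def observe_hook(module, grad_input, grad_output):
    print(grad_output[0].norm())
    # implicit `return None` leaves the gradients unchanged

# Option 2: return a tuple with one entry per element of grad_input
def scale_hook(module, grad_input, grad_output):
    return tuple(g * 0.5 if g is not None else None for g in grad_input)
```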