How to print gradient for each layer weight/bias in network?

Hello there!

I am having an issue printing the computed gradients for the following network:

struct Policy : torch::nn::Module {
  Policy() {
    affine1 = register_module("affine1", torch::nn::Linear(10, 20));
    affine2 = register_module("affine2", torch::nn::Linear(20, 5));
  }
  torch::nn::Linear affine1{nullptr}, affine2{nullptr};
};

I am trying to print the gradients as follows:

std::cout << "Gradient before: " << policy->affine1->weight.data().grad() << std::endl;
loss.backward();
std::cout << "Gradient after: " << policy->affine1->weight.data().grad() << std::endl;

However, the output I get is the following:

Gradient before: [ Tensor (undefined) ]
Gradient after: [ Tensor (undefined) ]

Does anyone know why this happens?
What would be the correct way to print the data I am looking to visualize?
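(For reference, here is a minimal sketch, not a confirmed answer, of one way to print per-layer gradients with the libtorch API: call .grad() on the parameter tensor itself rather than on .data(), since .data() returns a detached tensor whose .grad() is always undefined.)

for (const auto& pair : policy->named_parameters()) {
  const torch::Tensor& param = pair.value();
  // .grad() must be called on the parameter itself; param.data().grad()
  // is always an undefined tensor because .data() detaches the parameter.
  if (param.grad().defined()) {
    std::cout << pair.key() << " grad:\n" << param.grad() << std::endl;
  } else {
    std::cout << pair.key() << " grad: (undefined, e.g. before backward())" << std::endl;
  }
}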

Thank you very much.

Did you solve your problem? I also ran into the Tensor (undefined) issue when I push_back a TensorList into a vector and then print the vector's elements (which are the input TensorList).

Hi @Keunhoi_An,

No, I unfortunately did not solve the problem :confused:

Hi @Keunhoi_An, are you using the ParameterList?

No, I didn’t use ParameterList. I just store my bbox_xywh and score in torch::Tensor variables and combine them into a single TensorList. That is my case. May I ask what your issue with TensorList is?
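(One possible explanation, which is only an assumption about your case: torch::TensorList is an alias for c10::ArrayRef<torch::Tensor>, i.e. a non-owning view. If the tensors it points at go out of scope before you print, the elements can show up as undefined. The sketch below keeps the tensors alive in an owning std::vector and builds the view from that; bbox_xywh and score are stand-ins for your tensors.)

#include <torch/torch.h>
#include <iostream>
#include <vector>

int main() {
  // Hypothetical tensors standing in for bbox_xywh and score.
  torch::Tensor bbox_xywh = torch::rand({4});
  torch::Tensor score = torch::rand({1});

  // Own the tensors in a std::vector so they stay alive...
  std::vector<torch::Tensor> owned{bbox_xywh, score};

  // ...and build the non-owning TensorList view from that vector.
  torch::TensorList view(owned);

  for (const torch::Tensor& t : view) {
    std::cout << t << std::endl;  // prints defined tensors
  }
  return 0;
}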