Set parameter gradient to zero in C++

Hi, I am new to the PyTorch C++ frontend.
I want to set the parameters’ gradients to zero. My code looks like this:

torch::Tensor a = torch::randn({2, 2}, torch::requires_grad()); // 2x2 matrix
torch::Tensor b = torch::randn({2, 1}); // 2x1 vector
torch::Tensor c = torch::mm(a, b);      // 2x1 vector

c[0].backward();

the error is:

error: ‘’ does not have class type;

I couldn’t find the function that sets the gradient to zero in the C++ API.
How can I achieve the same as in Python?

Thank you.

If you train with an optimizer in your code, you can zero the gradients through the optimizer by calling its zero_grad() method.

Thank you Lin_Jia.
I did it using an optimizer.

std::vector<torch::Tensor> para; // filled with the model's parameters
torch::optim::Adam optim(para);
optim.zero_grad(); // sets every parameter's gradient to zero