Set parameter gradient to zero in C++

Hi, I am new to the PyTorch C++ API.
I want to set the parameters' gradients to zero. My code looks like this:

torch::Tensor a = torch::rand({2, 2}); // 2x2 matrix
torch::Tensor b = torch::rand({2, 1}); // 2x1 vector
b.requires_grad_(true);
torch::Tensor c = a.mm(b); // 2x1 vector

c[0].backward();
b.grad.data.zero_();

the error is:

error: ‘b.at::Tensor::grad’ does not have class type
     b.grad.data.zero_();

I couldn't find a function that sets the gradient to zero in C++.
How can I achieve the same as b.grad.data.zero_() in Python?

Thank you.

If you train with an optimizer, you can zero the gradients through the optimizer:

optimizer.zero_grad();

Thank you Lin_Jia.
I did it using an optimizer.

std::vector<torch::Tensor> para;
para.push_back(b);
torch::optim::Adam optim(para);

c[0].backward();
optim.zero_grad();