cruzas
(Samuel)
1
Hello there!
I was wondering if it’s possible to set requires_grad = False for a model’s parameters in C++?
For instance, if I understood correctly, in Python this would be done as follows:
for param in model.parameters():
    param.requires_grad = False
What is the equivalent of this in C++?
Thank you very much.
Oktai15
(Oktai Tatanov)
2
Just add the following line to your code:
torch::NoGradGuard no_grad;
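For context, torch::NoGradGuard disables gradient recording for all operations executed while the guard is in scope, so tensors produced inside it have requires_grad == false. A minimal sketch (the Linear layer here is a hypothetical stand-in for your model):

```cpp
#include <torch/torch.h>

int main() {
  // Hypothetical single-layer model, just for illustration.
  torch::nn::Linear layer(4, 2);

  {
    // While this guard is alive, no autograd graph is built.
    torch::NoGradGuard no_grad;
    auto out = layer(torch::randn({1, 4}));
    // out.requires_grad() is false here.
  }
  // Outside the scope, gradient recording is enabled again.
}
```

Note this is scoped: the guard only affects operations that run while it exists, which matters for the follow-up below.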
cruzas
(Samuel)
3
Hi Oktai15!
Thank you for your reply!
If I understood correctly, NoGradGuard should, in essence, emulate the behaviour of requires_grad=False, correct?
If so, no parameters should change if I have the following, right? I.e., I should see the same bias before and after?
std::cout << "Biases before:\n" << policy->affine2->bias.data() << std::endl;
{
    torch::NoGradGuard no_grad;
    loss.backward();
    optimizer.step();
}
std::cout << "Biases after:\n" << policy->affine2->bias.data() << std::endl;
It turns out the biases still changed. Do you know why?
Oktai15
(Oktai Tatanov)
4
Hmm, try adding this line in the same scope as the model and the loss.
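The point is that the guard must be active while the forward pass builds the autograd graph; wrapping only backward() and step() is too late, since the graph already exists and the optimizer will still apply the accumulated gradients. A sketch of both options, assuming a hypothetical Linear model (if the goal is the exact equivalent of the Python loop in the original question, freezing the parameters directly may be what you want):

```cpp
#include <torch/torch.h>

int main() {
  torch::nn::Linear model(4, 1);  // hypothetical model for illustration
  torch::optim::SGD optimizer(model->parameters(), /*lr=*/0.1);

  // Option 1: put the guard in the same scope as the forward pass.
  // No graph is recorded, so there is nothing to backpropagate.
  {
    torch::NoGradGuard no_grad;
    auto out = model(torch::randn({1, 4}));
    // out.requires_grad() is false; calling backward() would throw.
  }

  // Option 2: the direct C++ equivalent of the Python loop
  // `for param in model.parameters(): param.requires_grad = False`.
  for (auto& param : model->parameters()) {
    param.set_requires_grad(false);
  }
}
```

With option 2 the parameters stay frozen for all subsequent forward/backward passes, whereas option 1 only suppresses gradients inside the guarded scope.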