Deactivate backward

Hi,

What is the difference between these two APIs?

torch::NoGradGuard guard;
torch::autograd::GradMode::set_enabled(false);

Can I use them interchangeably to change the weights of a network?

Thanks,
Afshin

torch::NoGradGuard guard; is an RAII guard: it disables grad mode for the lifetime of the guard object and automatically restores the previous mode when the guard goes out of scope. torch::autograd::GradMode::set_enabled(false); instead sets a global flag that stays in effect until you explicitly change it back. Consider an example:

auto t = torch::randn({2, 2});

{
  torch::NoGradGuard guard;  // Only takes effect within this scope
  t.add_(1);  // Not recorded by autograd
}
t.add_(1);  // Recorded by autograd again (guard was destroyed)

{
  torch::autograd::GradMode::set_enabled(false);  // Global setting, not scoped
  t.add_(1);  // Not recorded by autograd
}
t.add_(1);  // Still not recorded: the flag survives the end of the scope

Thanks for the explanation. So, if I add torch::autograd::GradMode::set_enabled(true); after the operation, does it re-enable gradient tracking?

{
  torch::autograd::GradMode::set_enabled(false);  // Global setting, not scoped

  t.add_(1);  // Not recorded by autograd

  torch::autograd::GradMode::set_enabled(true);
}

@afshin67 Yes it does 🙂
