Difference between set_grad_enabled(False) and no_grad()

Just a quick question: the following are exactly equivalent, correct?

with torch.no_grad():
    # block of code

with torch.set_grad_enabled(False):
    # block of code

It seems there is no difference: https://stackoverflow.com/a/53447634/6888630

There is no difference.
The only distinction is that torch.set_grad_enabled() takes a bool argument, while torch.no_grad() does not.
Also, torch.set_grad_enabled(False) can be called as a plain function (not just as a context manager) to disable gradient tracking globally until it is enabled again.
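For illustration, a minimal sketch of the two ways to use it (as a context manager versus as a plain function call):

import torch

x = torch.ones(3, requires_grad=True)

# As a context manager: gradient tracking is disabled only inside the block.
with torch.set_grad_enabled(False):
    y = x * 2
print(y.requires_grad)  # False

# As a plain function call: gradient tracking stays disabled afterwards
# until it is explicitly re-enabled.
torch.set_grad_enabled(False)
z = x * 2
print(z.requires_grad)  # False
torch.set_grad_enabled(True)  # restore the default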


Great, thanks a lot. It is very convenient to be able to use the same piece of code for training and evaluation.
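For example, a common pattern (a minimal sketch; the helper name run_epoch and its arguments are just illustrative) is to gate gradient tracking on a single boolean, so one code path serves both training and evaluation:

import torch
import torch.nn.functional as F

def run_epoch(model, loader, optimizer=None, train=False):
    # One code path for training and evaluation:
    # gradients are tracked only when train=True.
    model.train(train)
    with torch.set_grad_enabled(train):
        for inputs, targets in loader:
            outputs = model(inputs)
            loss = F.cross_entropy(outputs, targets)
            if train:
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()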