Will the output be the same with or without torch.no_grad()? torch.no_grad() disables the autograd engine in order to speed up computation and reduce memory usage. Does it also affect the results? In both cases, model.eval() is active.
torch.no_grad() will only disable autograd and avoid storing the forward activations; it will not change any behavior of the model, so the outputs will be the same. However, if the model has built-in randomness that cannot be disabled via model.eval(), torch.no_grad() won't change that either.
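A minimal sketch to verify this, assuming a small toy model (the architecture here is arbitrary, chosen only for illustration): with model.eval() set, the dropout layer is inactive, so running the same input with and without torch.no_grad() produces identical values; only the requires_grad flag of the output differs.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy model for illustration
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # disabled by model.eval()
    nn.Linear(8, 2),
)
model.eval()  # dropout off, batchnorm (if any) uses running stats

x = torch.randn(1, 4)

with torch.no_grad():
    out_no_grad = model(x)  # no graph built, activations not stored

out_grad = model(x)  # graph is built, but the values are identical

print(torch.allclose(out_no_grad, out_grad))  # True
print(out_no_grad.requires_grad, out_grad.requires_grad)  # False True
```

If the model instead called, say, F.dropout(x, training=True) with training hard-coded, neither model.eval() nor torch.no_grad() would remove that randomness, and the two outputs would differ.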