Is there a way to globally disable autograd?

This might be a stupid question, but here it goes… In my Module, I would like to calculate and update the gradients myself. Is there a way to disable autograd globally rather than setting requires_grad=False on every tensor?

You can use:

with torch.no_grad():
    # your code here
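For instance, a minimal sketch showing that operations performed inside the `torch.no_grad()` context do not track gradients, while operations outside it do:

```python
import torch

x = torch.ones(3, requires_grad=True)

# Inside the context manager, autograd is disabled: results of
# operations on tensors that require grad come out detached.
with torch.no_grad():
    y = x * 2

print(y.requires_grad)  # False

# Outside the context, gradient tracking resumes as usual.
z = x * 2
print(z.requires_grad)  # True
```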

In addition to @ebarsoum’s solution, you could also use torch.autograd.set_grad_enabled(mode) to globally disable gradient calculation.
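A small sketch of the global switch: unlike the context manager, a plain call to `torch.autograd.set_grad_enabled(False)` stays in effect until you re-enable it (it can also be used as a context manager itself):

```python
import torch

# Globally disable gradient tracking until re-enabled.
torch.autograd.set_grad_enabled(False)

a = torch.ones(2, requires_grad=True)
b = a * 3
print(b.requires_grad)  # False

# Re-enable gradient tracking when you are done.
torch.autograd.set_grad_enabled(True)
c = a * 3
print(c.requires_grad)  # True
```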
