How to turn off gradient tracking without using 'with torch.no_grad():'

Hello,

My request is pretty simple. I want to be able to turn off autograd mechanics without having to indent my entire code in a ‘with’ block. This way I can use it dynamically. I have so far tried this:

if do_grad:
    torch.enable_grad()
else:
    torch.no_grad()
...

But gradients are still being tracked no matter which branch runs.
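The reason the snippet above has no effect is that `torch.no_grad()` and `torch.enable_grad()` return context-manager objects; simply calling them, without entering them via a `with` block (or using them as decorators), leaves the autograd state unchanged. A minimal sketch of the behavior:

```python
import torch

torch.no_grad()  # constructs a context manager and discards it; nothing changes
x = torch.ones(2, requires_grad=True)
y = x * 2
print(y.requires_grad)  # True: gradients are still being tracked

with torch.no_grad():  # tracking is only disabled inside the block
    z = x * 2
print(z.requires_grad)  # False
```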

Nevermind! I found the correct function to call:

torch.set_grad_enabled(True)

and

torch.set_grad_enabled(False)

do exactly what I want.
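To illustrate, here is a small sketch of the dynamic toggle described above (`do_grad` is just a placeholder flag for this example):

```python
import torch

do_grad = False
# Toggles autograd globally, so it can be called conditionally
# instead of indenting everything inside a `with` block.
torch.set_grad_enabled(do_grad)

x = torch.ones(3, requires_grad=True)
y = x * 2
print(y.requires_grad)  # False: no graph was recorded

torch.set_grad_enabled(True)  # restore the default afterwards
z = x * 2
print(z.requires_grad)  # True
```

Note that `torch.set_grad_enabled` can also be used as a context manager (`with torch.set_grad_enabled(do_grad):`) if a scoped, condition-dependent toggle is ever needed.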

I know I am late and you found the solution almost two years ago, but anyone looking for a similar solution can also check out the .detach() method.
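For completeness, `.detach()` works per-tensor rather than globally: it returns a tensor that shares the same data but is cut off from the computation graph. A quick sketch:

```python
import torch

x = torch.ones(3, requires_grad=True)
y = (x * 2).detach()  # same values, but excluded from the autograd graph
print(y.requires_grad)  # False

# The rest of the graph is unaffected: x itself still tracks gradients.
print(x.requires_grad)  # True
```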