I’m transitioning to 0.4 and I love the new syntax!
I have a bit of a problem understanding how to use
with torch.set_grad_enabled(True): to initialize tensors with grad enabled:
In : with torch.set_grad_enabled(True):
...:     X = torch.tensor(10)
...:     y = X * 10

In : X.requires_grad
Out: False

In : y.requires_grad
Out: False
Shouldn’t both be True?
I think the with: block only controls whether autograd records operations performed inside it; it doesn’t set requires_grad on tensors you create there, and a freshly created tensor defaults to requires_grad=False.
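To illustrate what I mean, here is a minimal sketch (behaviour is the same on current PyTorch releases): grad mode only decides whether operations on a tensor that already requires grad get recorded.

```python
import torch

# a leaf tensor that requires grad (must be floating point)
x = torch.tensor(10., requires_grad=True)

with torch.set_grad_enabled(False):
    y = x * 10          # grad mode off: the multiply is not recorded
print(y.requires_grad)  # False

with torch.set_grad_enabled(True):
    z = x * 10          # grad mode on: the multiply is recorded
print(z.requires_grad)  # True
```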
You can set requires_grad for a tensor with
X.requires_grad = True
(note you’ll need a floating-point tensor, e.g. torch.tensor(10.), since integer tensors can’t require gradients).
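A quick sketch of that, using 10. instead of 10 since only floating-point tensors can require gradients:

```python
import torch

X = torch.tensor(10.)   # floating point, so it can require grad
X.requires_grad = True  # mark the leaf tensor for autograd
y = X * 10              # this op is now recorded in the graph

print(X.requires_grad, y.requires_grad)  # True True
y.backward()
print(X.grad)           # dy/dX = 10
```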
Ah ok … makes sense - thanks!
If you care for a bit of unsolicited advice: note that a typo like
X.reqires_grad = True will silently not do what you expect (attribute assignment just creates a new Python attribute on the tensor), while the same typo in the method call,
X.reqires_grad_(), will raise an AttributeError and tell you about your spelling. For that reason (and for writing things like
y = x.detach().requires_grad_() and similar more compactly) I have moved to using the method exclusively.
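To make the pitfall concrete, here is the same typo played out on a small float tensor:

```python
import torch

x = torch.tensor(10.)

x.reqires_grad = True       # typo: silently creates a new Python attribute
print(x.requires_grad)      # still False, and nothing complained

try:
    x.reqires_grad_()       # the same typo as a method call fails loudly
except AttributeError as e:
    print("caught:", e)

x.requires_grad_()          # correct spelling, sets requires_grad in place
y = x.detach().requires_grad_()  # a fresh leaf sharing storage with x
print(x.requires_grad, y.requires_grad)  # True True
```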