Well, I am not sure whether I understand the tutorial correctly. In the AUTOGRAD: AUTOMATIC DIFFERENTIATION tutorial, it says: "If you set its attribute .requires_grad as True, it starts to track all operations on it." However, later the tutorial says: "You can also stop autograd from tracking history on Tensors with .requires_grad=True either by wrapping the code block in with torch.no_grad():". To stop autograd from tracking history on tensors, shouldn't we set requires_grad = False? Is this a conflict?
No, you won’t be able to set requires_grad=False for any non-leaf tensor and will get the following error:

RuntimeError: you can only change requires_grad flags of leaf variables. If you want to use a computed variable in a subgraph that doesn't require differentiation use var_no_grad = var.detach().
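
A minimal sketch reproducing this behavior (the tensor names here are illustrative):

```python
import torch

x = torch.ones(3, requires_grad=True)  # leaf tensor: created directly by the user
y = x * 2                              # non-leaf tensor: produced by an operation

x.requires_grad = False  # fine: x is a leaf variable
y.requires_grad = False  # raises the RuntimeError quoted above
```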
Use a torch.no_grad() block, which ensures Autograd won’t track any operations inside it, or detach() the tensor as described in the tutorial.
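
For reference, a short sketch of both workarounds:

```python
import torch

x = torch.ones(3, requires_grad=True)

# Option 1: torch.no_grad(): operations inside the block are not tracked
with torch.no_grad():
    y = x * 2
print(y.requires_grad)  # False

# Option 2: detach(): returns a new tensor cut off from the computation graph
z = (x * 2).detach()
print(z.requires_grad)  # False
```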