Embarrassingly simple question about autograd


I just wrote the following code and I am not getting the answer that I expected.

import numpy as np
import torch

y = torch.tensor([10], dtype=torch.float32, requires_grad=True)
w = (np.pi * y) / 180
f = torch.sin(w)
f.backward()
print(w.grad)

It prints None. Shouldn't the answer be 0.984807753012208? Here is my understanding: we have $f(w) = \sin(w)$ where $w = \frac{\pi y}{180}$, so w.grad should give us $\frac{\partial f}{\partial w}\big|_{w=0.1745329} = \cos(0.1745329) \approx 0.9848$. However, I am not getting that.

Could someone please help me understand this?

Thank you!


When you call .backward(), the .grad field of only the leaf tensors is populated (to reduce memory usage).
A leaf tensor is one that the user created directly, e.g. with requires_grad=True. In your case, y.is_leaf == True but w.is_leaf == False, because w is the result of an operation on y.
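A quick way to see the distinction (a minimal sketch; math.pi is used here instead of np.pi to avoid the NumPy dependency):

```python
import math
import torch

y = torch.tensor([10.0], requires_grad=True)  # created directly by the user -> leaf
w = (math.pi * y) / 180                       # produced by an operation -> non-leaf

print(y.is_leaf, w.is_leaf)  # True False
```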

If you want the .grad field to be populated for a Tensor that is not a leaf, you need to call .retain_grad() on it before the backward pass.


@albanD Ah, thank you! I understand your comment much better now after reading this GitHub issue