Here is the code. I want x to be iterated down to zero by gradient descent.
When the initial value is positive, everything works as expected. However, when x starts out negative, x eventually becomes -inf.
I found that the gradient computed by the code is always positive. However, mathematically, the slope of x**2 at x = -1 should be negative (namely -2). (Typo corrected on 8/8)
I have no idea how to solve this problem, and I wonder whether someone can help me.
    import torch

    x = torch.tensor([-1.0])
    x.requires_grad_(True)
    rate = 0.1
    N = 10
    for e in range(N):
        loss_fn = x ** 2
        loss_fn.backward(x)           # backward is given x as the gradient argument
        with torch.no_grad():
            delta = x.grad.item()
            x = x - rate * x.grad     # creates a new tensor, so x is a fresh leaf
        x.requires_grad_(True)
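For reference, here is a minimal sanity check (separate from the loop above) confirming what the gradient should be: calling backward() on the scalar loss with no gradient argument yields the analytic derivative of x**2 at x = -1, which is 2 * (-1) = -2.

    import torch

    # Analytic check: d/dx (x**2) at x = -1 should be -2.
    x = torch.tensor([-1.0], requires_grad=True)
    loss = x ** 2
    loss.backward()  # no gradient argument: a scalar loss uses an implicit 1.0
    print(x.grad.item())  # prints -2.0

This is the sign I expect inside the loop as well, but the loop's gradient comes out positive instead.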