Can the result of grad be negative?

Here is the code. I want x to be driven to zero by the iteration.
When I set the initial value to a positive number, everything goes right. However, when x is negative at the beginning, x eventually becomes -inf.
I found that the result of grad is always positive in this code. However, in math, the slope of x**2 at x = -1 should be negative. (Typo corrected on 8/8)
I have no idea how to solve the problem and wonder if someone can help me.

import torch
x = torch.tensor([-1.0])
x.requires_grad_(True)
rate = 0.1
N = 10
for e in range(N):
    loss_fn = x**2
    loss_fn.backward(x)
    with torch.no_grad():
        delta = x.grad.item()
        x = x - rate * x.grad
    x.requires_grad_(True)
    

Replace the line loss_fn.backward(x) with loss_fn.backward(). After that change, the optimization should behave as expected. To understand the difference between the two calls, please check the autograd tutorial (it contains examples of both).
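The reason the original call always yields a positive gradient: loss_fn.backward(x) uses x as the incoming gradient of a vector-Jacobian product, so x.grad ends up as x * 2x = 2x**2, which is never negative, while loss_fn.backward() uses 1 and gives the true derivative 2x. Here is a minimal corrected sketch of the loop (same variable names as the original post; the print call is only added for illustration):

import torch

x = torch.tensor([-1.0], requires_grad=True)
rate = 0.1
N = 10
for e in range(N):
    loss_fn = x**2
    loss_fn.backward()               # x.grad is now 2*x, negative when x is negative
    with torch.no_grad():
        x = x - rate * x.grad        # gradient step moves x toward zero from either side
    x.requires_grad_(True)           # the update created a new leaf tensor
    print(e, x.item())               # e.g. -0.8, -0.64, ... approaching 0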

"However, in math, the slope of x**2 at x = 1 should be negative."

This statement does not hold. The derivative of x**2 is 2*x, so at x = 1 its value is 2 * 1 = 2, which is positive.
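As a quick sanity check (a standalone snippet of my own, not from the thread), evaluating the gradient of x**2 at x = 1 and x = -1 gives 2 and -2, so the result of grad can indeed be negative:

import torch

x = torch.tensor(1.0, requires_grad=True)
(x**2).backward()
print(x.grad)    # tensor(2.)  -> positive slope at x = 1

x = torch.tensor(-1.0, requires_grad=True)
(x**2).backward()
print(x.grad)    # tensor(-2.) -> negative slope at x = -1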


Thanks a lot.

Yeah, you’re right, that’s the typo. Many thanks!
