No gradients when backwarding with respect to repeat or expand?

I’m a little confused about why I don’t get gradients in the following examples, using either repeat or expand:

import torch

# repeat
x = torch.randn(10, dtype=torch.float32, requires_grad=True)
x = x.repeat(5, 1)
y = (x**2).sum()
y.backward()
print(x.grad is None)  # True

# expand
x = torch.randn(10, dtype=torch.float32, requires_grad=True)
x = x.expand(5, -1)
y = (x**2).sum()
y.backward()
print(x.grad is None)  # True

In both cases x.grad is None.

On the other hand, this example does give gradients:

x = torch.randn(5, 10, requires_grad=True)
y = (x**2).sum()
y.backward()
print(x.grad is None)  # False

What’s going on here? Is this because x is no longer a leaf?

Hi,

The difference is that in the first two examples you overwrite x, so the x whose .grad you inspect is no longer the leaf tensor you created: it is the non-leaf result of repeat or expand, and gradients are only accumulated in the .grad field of leaf tensors. In the last example, x stays the leaf because you store the result of the computation directly in y.
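To illustrate, a minimal sketch: binding the result of repeat to a different name (x_rep here is just an arbitrary name for illustration) keeps the leaf tensor reachable, so its .grad is populated after backward:

```python
import torch

# Keep the leaf tensor under its own name so .grad stays reachable.
x = torch.randn(10, dtype=torch.float32, requires_grad=True)
x_rep = x.repeat(5, 1)   # non-leaf tensor produced by repeat
y = (x_rep ** 2).sum()
y.backward()
print(x.grad is None)    # False: gradients accumulate on the leaf
# Since y = 5 * sum(x**2), the gradient w.r.t. x is 10 * x.
```

Alternatively, if you need the gradient of an intermediate (non-leaf) tensor, you can call `.retain_grad()` on it before `backward()`.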
