I have a tensor called grads with stride (1866240, 155520, 64, 1) and shape torch.Size([1, 12, 2430, 64]).
But when I do a backward pass through a custom autograd function, like this:
result.backward(grads)
In other words:
@staticmethod
def backward(ctx, grad_output):
The stride of grad_output becomes (1, 155520, 64, 1), even though the stride of grads is (1866240, 155520, 64, 1) and the shape hasn't changed. Why does this happen?
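A minimal, self-contained sketch of the setup. The Identity function here is hypothetical, just a pass-through so the stride of grad_output can be inspected inside backward; my real function does more than this:

```python
import torch

class Identity(torch.autograd.Function):
    # Hypothetical pass-through function, used only to observe
    # what autograd hands to backward()
    @staticmethod
    def forward(ctx, x):
        return x * 1  # any op that returns a new tensor

    @staticmethod
    def backward(ctx, grad_output):
        # Print the stride autograd actually passes in
        print("grad_output stride:", grad_output.stride())
        return grad_output

x = torch.randn(1, 12, 2430, 64, requires_grad=True)
result = Identity.apply(x)

# grads is contiguous, so its stride is (1866240, 155520, 64, 1)
grads = torch.randn(1, 12, 2430, 64)
print("grads stride:", grads.stride())

result.backward(grads)
```

Note that because dim 0 has size 1, both strides describe the same memory layout; the stride entry for a size-1 dimension never affects which element is addressed.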