RuntimeError: Inplace operation and .backward()

Hello guys!

I’m getting the following error when calling .backward():

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation

for i, j, k in zip(X, Y, Z):
    A[:, i, j] = A[:, i, j] + k  # in-place write into A

X and Y are leaf tensors. Z is a non-leaf tensor coming from the previous layer.
The (i, j) pairs are not unique: e.g. (3, 9, 0.5) and (3, 9, 0.4) should accumulate to 0.9 at position (3, 9).

I’ve tried .clone(), torch.add(), torch.sum(), and so on.

Please help! :roll_eyes:

Hi @Natasha_667 ,

If you don’t absolutely need these operations to be in-place, this should work:

B = A.clone()
for i, j, k in zip(X, Y, Z):
    B[:, i, j] = B[:, i, j] + k  # read from B, not A, so repeated (i, j) pairs accumulate

Or is this what you were referring to when you said "I've tried .clone()"?
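Since the (i, j) pairs repeat, another option (a sketch, not from this thread; shapes are assumed for illustration) is to do the whole scatter in one out-of-place call with `Tensor.index_put(..., accumulate=True)`, which sums values at duplicate indices and is differentiable:

```python
import torch

# Assumed shapes for illustration: A is (batch, H, W).
A = torch.zeros(2, 4, 10)
X = torch.tensor([3, 3, 1])   # row indices (duplicates allowed)
Y = torch.tensor([9, 9, 0])   # column indices
Z = torch.tensor([0.5, 0.4, 1.0], requires_grad=True)

batch = A.shape[0]
b_idx = torch.arange(batch).view(-1, 1)  # (batch, 1), broadcasts against (1, n) indices
# Out-of-place index_put with accumulate=True sums values at duplicate
# (i, j) positions, so (3, 9, 0.5) and (3, 9, 0.4) yield +0.9 at [:, 3, 9].
B = A.index_put((b_idx, X.view(1, -1), Y.view(1, -1)),
                Z.expand(batch, -1), accumulate=True)

B.sum().backward()  # gradient flows back into Z
```

This avoids the Python loop entirely and never touches A in place, so autograd has nothing to complain about.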


Thanks Serge! :slight_smile: I was using it wrong :frowning: