One of the variables needed for gradient computation has been modified by an inplace-operation

I am running into the error "One of the variables needed for gradient computation has been modified by an inplace operation" when backpropagating through my loss function. My loss function is a simple sum:

def loss_fn(x):

    total_loss = []
    for i in range(k):
        loss = ...
        total_loss.append(loss)

    final_loss = torch.sum(torch.stack(total_loss))
    return final_loss

I can call loss.backward() at any given iteration of the loop, but calling final_loss.backward() fails, which suggests the for loop is breaking it. I do not see an in-place operation, and I have already tried the solutions from other questions.

Hi,

This is quite surprising indeed.
Can you share a small code sample that reproduces this?


Hi,

I found my error: I had an in-place variable assignment in one of my helper functions for calculating the loss. Thanks!
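For anyone hitting the same error, here is a minimal sketch of the failure mode (not the original poster's code, just a standard reproduction): sigmoid saves its output for the backward pass, so writing into that output in place invalidates the saved tensor, and autograd raises the error at backward() time. Writing into a clone instead avoids the problem.

```python
import torch

# Reproduce the error: modify a tensor that autograd saved for backward.
x = torch.randn(5, requires_grad=True)
y = torch.sigmoid(x)   # sigmoid saves its output for the backward pass
y[0] = 0.0             # in-place write bumps y's version counter
try:
    y.sum().backward() # autograd detects the stale saved tensor
except RuntimeError as e:
    print("RuntimeError:", e)

# Fix: write into a fresh tensor so the saved output stays untouched.
x2 = torch.randn(5, requires_grad=True)
y2 = torch.sigmoid(x2).clone()  # clone's backward does not need its output
y2[0] = 0.0                     # safe: modifies the clone only
y2.sum().backward()
print(x2.grad)                  # grad for x2[0] is 0, rest are sigmoid'(x2)
```

Running the helper functions with `torch.autograd.set_detect_anomaly(True)` enabled also points at the exact in-place op that caused the failure.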
