Cloning one tensor to another without inplace operations

Hello! I need to optimize a custom loss function, but I can’t use loss.backward() because of “inplace operations”. In my code, I need this part:

G[i] = rewards[i] + G[i + 1] * gamma

where G and rewards are float torch tensors with requires_grad=True. After this, the loss is computed as

loss = - torch.sum(lprob4act * G) / len(rewards)

where lprob4act is also a torch tensor with requires_grad=True that is produced by the model. But when I call loss.backward(), the program crashes: the interpreter reports that some values required by autograd were modified by inplace operations. When I delete this part of my code it doesn't crash. What is the inplace operation here, and how do I fix it? Thanks.

Hi,

If G is a Tensor, then G[i] = xxx is an inplace operation on G.
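Concretely, G[i] = xxx goes through Tensor.__setitem__, which writes into G's storage and bumps its internal version counter; a mismatch on that counter is what autograd reports when you call backward(). A minimal illustration (._version is an internal attribute, inspected here only to show the mechanism):

```python
import torch

G = torch.zeros(3)
print(G._version)  # 0 -- no inplace ops on G yet
G[0] = 1.0         # indexing assignment writes into G's storage inplace
print(G._version)  # 1 -- autograd checks this counter during backward
```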
If you never need G as a full Tensor, you may want to make it a plain Python list to avoid the inplace operation: you will mutate the list, not any Tensor, so autograd will be happy and backward() will work.
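Something like this, for example (a rough sketch only: I am reconstructing gamma, rewards and lprob4act from your description, so adapt the names and values to your code):

```python
import torch

gamma = 0.99                                     # assumed discount factor
rewards = torch.tensor([1.0, 0.0, 2.0], requires_grad=True)
lprob4act = torch.randn(3, requires_grad=True)   # stand-in for the model output

# Build the returns back-to-front in a plain Python list: appending to
# the list never modifies a Tensor inplace.
G = []
running = torch.zeros(())                        # terminal value G[T] = 0
for i in range(len(rewards) - 1, -1, -1):
    running = rewards[i] + running * gamma       # same recurrence as above
    G.append(running)
G.reverse()

# torch.stack turns the list back into a single Tensor, so the loss
# line stays exactly as you wrote it.
loss = - torch.sum(lprob4act * torch.stack(G)) / len(rewards)
loss.backward()                                  # no inplace error anymore
print(lprob4act.grad)
```

Autograd differentiates through torch.stack like any other op, so the gradients still flow back into lprob4act (and rewards) correctly.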