Does this work if I assign something to a leaf tensor in the forward() function, like this?

Pseudocode:

def forward(self, x):
    out = torch.zeros(10, 10, 10, 10)
    out[:, :, row, col] = x  # row and col are indices
    return out

This feels weird to me, because I did not operate on x itself; instead, x is assigned to a part of another leaf tensor. Does this have any influence on backward?
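For concreteness, here is a minimal runnable version of what I mean (the shapes, indices, and sum() loss are made up for illustration):

import torch

row, col = 3, 7  # made-up example indices

x = torch.randn(10, 10, requires_grad=True)
out = torch.zeros(10, 10, 10, 10)
out[:, :, row, col] = x  # write x into a slice of out, in place

out.sum().backward()  # this does run, and x.grad gets populated
print(x.grad.shape)   # torch.Size([10, 10])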

Hi,

You can do this, and if the backward pass does not raise an error, you will get the correct gradients.
What can happen, though, is that you get an error like "one of the variables needed for gradient computation has been modified by an inplace operation", because in-place operations can change a tensor that is needed during the backward pass. The engine is over-restrictive here: if the backward pass runs, it will always give the right answer; if it raises this error, you will have to remove the in-place ops. To build up a tensor of results, you can, for example, store them in a list as you create them and then cat them at the end.
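To illustrate both halves of this, here is a minimal sketch (the exp() example and the placeholder loop are my own, not from the original post): exp() saves its output for the backward pass, so modifying that output in place triggers the error, while collecting results in a list and calling torch.cat once at the end avoids in-place writes on anything autograd needs.

import torch

# Case 1: an in-place write that breaks backward. exp() saves its
# output to compute its gradient, so changing that output in place
# invalidates a tensor the engine needs.
a = torch.randn(3, requires_grad=True)
b = a.exp()
b[0] = 0.0  # in-place modification of a tensor needed for backward
try:
    b.sum().backward()
except RuntimeError as e:
    print(e)  # "...has been modified by an inplace operation..."

# Case 2: the suggested workaround -- build the results in a list,
# then combine them with a single out-of-place torch.cat.
x = torch.randn(10, 10, requires_grad=True)
results = []
for i in range(5):                # placeholder loop
    results.append(x * float(i))  # placeholder per-step computation
out = torch.cat(results, dim=0)   # shape (50, 10); no in-place ops involved
out.sum().backward()
print(x.grad[0, 0])               # tensor(10.) = 0 + 1 + 2 + 3 + 4

torch.stack works the same way if you want the results stacked along a new dimension instead.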


Thanks for your nice reply!!!