@albanD In that case, is it proper to use a `for` loop to index the `torch.Tensor`, as in the `forward` function?
I wrote it before learning that in-place operations should be avoided in PyTorch. That reply states that assigning a value to an index is an in-place operation, so for that reason my implementation seems incorrect to me. On the other hand, the `forward` and `backward` functions work as expected, which suggests it might be fine after all, so I am confused. Any elaboration would be helpful.