Hi,
I am trying to apply torch.sqrt to a tensor that contains 0 values. My goal is to skip the 0 values, i.e. apply torch.sqrt only to the entries that are different from 0. To do that I have tried the following:
t = torch.tensor([[0.0, 1.0, 2.0], [3.0, 0.0, 4.0], [5.0, 6.0, 0.0]])
aux = t[t != 0].clone()
aux = torch.sqrt(aux)
t[t != 0] = aux.clone()
This works, but it breaks backpropagation. I am getting the following error:
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [7822, 1]], which is output 0 of DivBackward0, is at version 1; expected version 0 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!
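For reference, here is a small self-contained repro of the same pattern (the real tensor in my code is a [7822, 1] CUDA tensor; here I use a small CPU tensor, and torch.tanh as a stand-in upstream op, since tanh both preserves zeros and saves its output for the backward pass):

```python
import torch

# Stand-in for the real data; requires_grad so backprop runs.
a = torch.tensor([[0.0, 1.0, 2.0],
                  [3.0, 0.0, 4.0],
                  [5.0, 6.0, 0.0]], requires_grad=True)

t = torch.tanh(a)        # tanh(0) == 0, and tanh saves its output for backward

aux = t[t != 0].clone()  # sqrt only the nonzero entries
aux = torch.sqrt(aux)
t[t != 0] = aux.clone()  # in-place write bumps t's version counter

try:
    t.sum().backward()
    err = None
except RuntimeError as e:
    err = e

print(err)  # complains that a saved variable was modified by an inplace operation
```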
It seems like the line aux = t[t != 0].clone() is what breaks backpropagation. How can I solve this? Is there an easy way to achieve my goal?
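Edit: for reference, this is the kind of out-of-place version I am considering, based on the torch.where pattern. The inner torch.where is meant to keep sqrt's gradient (which is infinite at 0) from producing inf/nan at the masked-out entries. Would this be the right approach?

```python
import torch

t = torch.tensor([[0.0, 1.0, 2.0],
                  [3.0, 0.0, 4.0],
                  [5.0, 6.0, 0.0]], requires_grad=True)

mask = t != 0
# Replace zeros with a dummy value BEFORE sqrt, so sqrt's gradient
# is never evaluated at 0 (where it would be infinite).
safe = torch.where(mask, t, torch.ones_like(t))
# Take sqrt on the safe tensor, and keep the original values at the zeros.
result = torch.where(mask, torch.sqrt(safe), t)

result.sum().backward()  # no in-place ops, so backprop goes through
```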