Hi!
I have an autoencoder, and between the encoder and decoder I transform the data using torch.sign. When I do that, backpropagation of the gradient stops at that point.
If I replace torch.sign with torch.sigmoid, I don’t have that problem and the backpropagation goes all the way back to the beginning.
Do I have to do something different with torch.sign?
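A minimal example of what I mean, with a toy tensor standing in for my autoencoder’s latent code (the variable names are just for illustration):

```python
import torch

# Gradient through torch.sign: sign is piecewise constant, so its
# derivative is 0 almost everywhere. backward() runs without error,
# but the gradient that reaches x is all zeros.
x = torch.randn(5, requires_grad=True)
torch.sign(x).sum().backward()
print(x.grad)  # all zeros

# Gradient through torch.sigmoid: sigmoid is smooth, so a nonzero
# gradient flows back to y.
y = torch.randn(5, requires_grad=True)
torch.sigmoid(y).sum().backward()
print(y.grad)  # nonzero values
```

So the gradient isn’t blocked in the sense of an error, it’s just zero everywhere, which has the same effect on training.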