Loss does not decrease after modifying prediction with torch.floor()

Hello,
For the following code, the loss decreases.

loss_function = nn.MSELoss()
loss = loss_function(pred, label)

But the loss remains completely unchanged if I pass the prediction through a floor function first. I checked the parameters after opt.step(); they are not changing.

loss_function = nn.MSELoss()
loss = loss_function(torch.floor(pred), label)

Why might this happen? Does it break the computation graph?

If you plot the floor function, you will see that its derivative is zero almost everywhere and undefined at the integer steps. Zero gradients flow back through the floor, so nothing upstream receives a learning signal, which is a problem for gradient descent. :wink:
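A minimal sketch to see this (the tensor values here are made up for illustration): the gradient that reaches `pred` through `torch.floor()` is exactly zero, so `opt.step()` has nothing to apply.

```python
import torch
import torch.nn.functional as F

# Toy prediction and label; requires_grad=True so we can inspect the gradient.
pred = torch.tensor([0.3, 1.7, 2.5], requires_grad=True)
label = torch.tensor([0.0, 2.0, 2.0])

# floor() has zero derivative wherever it is defined.
loss = F.mse_loss(torch.floor(pred), label)
loss.backward()

print(pred.grad)  # tensor([0., 0., 0.]) -- no signal reaches pred
```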

Try replacing it with torch.floor(pred).detach() + pred - pred.detach(). This is a straight-through estimator: the forward value still equals torch.floor(pred), but in the backward pass the floor is treated as the identity, so gradients flow to pred.
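A quick sketch of that trick with the same toy tensors as above (made-up values, just for illustration): the forward result matches torch.floor(pred), but pred.grad is no longer all zeros.

```python
import torch
import torch.nn.functional as F

pred = torch.tensor([0.3, 1.7, 2.5], requires_grad=True)
label = torch.tensor([0.0, 2.0, 2.0])

# Straight-through estimator: detach() removes the floor from the graph,
# while "+ pred - pred.detach()" re-attaches pred with an identity gradient.
floored = torch.floor(pred).detach() + pred - pred.detach()
print(floored)  # tensor([0., 1., 2.], ...) -- same values as torch.floor(pred)

loss = F.mse_loss(floored, label)
loss.backward()
print(pred.grad)  # gradient now flows through pred, so opt.step() can update
```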