requires_grad=False after inputs.ge(0.5)

The variable ‘inputs’ is an output of the net, and it will be passed to the loss function. However, its requires_grad attribute changed to ‘False’ after I executed the following: inputs = inputs.ge(0.5).float()
In this case, can backpropagation still run normally through the network? If not, why? How can I solve it? Thanks in advance.

No, you won’t be able to calculate the gradients after the tensor.ge operation, since it’s not differentiable and would thus break the computation graph.
You could use a “soft” function such as sigmoid instead.
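
Here is a minimal sketch of the difference (the tensor values and the steepness factor k are just assumptions for illustration): the hard .ge() threshold produces a tensor with no gradient history, while a steep sigmoid approximates the same 0.5 cutoff and keeps the graph intact.

```python
import torch

# Hard threshold: .ge() returns a bool tensor with no grad history,
# so the computation graph is broken at this point.
inputs = torch.rand(4, requires_grad=True)
hard = inputs.ge(0.5).float()
print(hard.requires_grad)  # False -> cannot backprop through this

# Soft alternative: a steep sigmoid approximates the 0.5 threshold
# while remaining differentiable (k controls the steepness).
k = 50.0
soft = torch.sigmoid(k * (inputs - 0.5))
print(soft.requires_grad)  # True -> backprop still works
soft.sum().backward()
print(inputs.grad)         # gradients flow back to the inputs
```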
