requires_grad=False after

The variable `inputs` is an output of the net and will be passed to the loss function. However, its `requires_grad` attribute changed to `False` after I executed the following: inputs =
If this is the case, can backpropagation still run through the network normally? If not, why, and how can I fix it? Thanks in advance.

No, you won’t be able to calculate the gradients after that operation, since it’s not differentiable and would thus break the computation graph.
You could use a “soft” function, such as e.g. sigmoid, instead.
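Since the exact line from the question is truncated, here is a minimal sketch of the general situation, assuming the operation was something like a hard threshold (a typical non-differentiable op that detaches a tensor from the graph), contrasted with a sigmoid as the “soft” replacement:

```python
import torch

# Hypothetical stand-in for the truncated line in the question:
# a hard threshold, which is not differentiable.
out = torch.randn(4, requires_grad=True)

hard = (out > 0.5).float()        # comparison breaks the graph
print(hard.requires_grad)         # False -> no gradients can flow back

soft = torch.sigmoid(out * 10.0)  # differentiable "soft" approximation
print(soft.requires_grad)         # True -> still attached to the graph
soft.sum().backward()             # backprop works through the sigmoid
print(out.grad is not None)       # True
```

The scaling factor (here `10.0`) controls how closely the sigmoid approximates the hard step; it is an illustrative choice, not a value from the original thread.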
