Hi,
The result of `(model(x) > .5).float()` is a binary tensor. The thresholding comparison is not differentiable, so gradients cannot flow back through it. You might want to use a continuous activation such as `tanh()` instead (see this thread for a similar issue: Step Activation Function).
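To illustrate, here is a minimal sketch (using a hypothetical `Linear` layer as a stand-in for your `model`) showing that the thresholded result is detached from the autograd graph, while a continuous activation keeps gradients flowing:

```python
import torch

model = torch.nn.Linear(3, 1)  # hypothetical stand-in for your model
x = torch.randn(4, 3)

# Thresholding produces a bool tensor; casting it to float does not
# reattach it to the graph, so nothing can backpropagate through it.
hard = (model(x) > 0.5).float()
print(hard.requires_grad)  # False

# A continuous activation keeps the output differentiable.
soft = torch.sigmoid(model(x))  # or torch.tanh(...)
print(soft.requires_grad)  # True
```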