RuntimeError: Expected isFloatingType(grads[i].scalar_type()) to be true, but got false

inputs = Variable(inputs,requires_grad=False)
labels = Variable(labels,requires_grad=False)

logits = model.forward(inputs)
outputs= loss.forward(softmax(logits).argmax(axis=1).float(),labels.float())
optimizer.zero_grad()
outputs.backward()
optimizer.step()

I am feeding images and their corresponding binary labels to a CNN, but I am getting this error:

RuntimeError: Expected isFloatingType(grads[i].scalar_type()) to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.) (validate_outputs at …\torch\csrc\autograd\engine.cpp:476)
(no backtrace available)

I am using nn.BCELoss as the criterion and Adam as the optimizer, and I apply softmax to the model output before passing it to the loss.
Any idea how to fix it?

There are some issues in the code:

  • Variables are deprecated since PyTorch 0.4, so you can use plain tensors now.
  • Don’t call .forward directly; call the module instead: model(inputs).
  • nn.BCELoss is used for binary (or multi-label) classification and expects probabilities, i.e. a sigmoid as the last non-linearity — not a softmax. It’s generally recommended to pass raw logits to nn.BCEWithLogitsLoss instead, for better numerical stability.
  • argmax returns a LongTensor, which breaks the computation graph, since integer tensors cannot carry gradients — this is what raises your error. You also shouldn’t call argmax on the output before passing it to the criterion; the loss needs the continuous model output.
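Putting those points together, a minimal sketch of the corrected training step could look like this. The model here is a placeholder (a flatten + linear head standing in for your CNN), and the input shape, learning rate, and batch size are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Placeholder for your CNN: any module that maps images to one logit per sample.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 1))
criterion = nn.BCEWithLogitsLoss()  # takes raw logits; no sigmoid/softmax needed
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Plain tensors, no Variable wrapper (deprecated since 0.4).
inputs = torch.randn(4, 3, 8, 8)             # dummy batch of 4 images
labels = torch.randint(0, 2, (4,)).float()   # binary targets as float

logits = model(inputs)                       # call the module, not .forward
loss = criterion(logits.squeeze(1), labels)  # no argmax: keeps the graph differentiable

optimizer.zero_grad()
loss.backward()   # works now, since loss is built from floating-point tensors
optimizer.step()
```

At inference time you can recover hard predictions with `(torch.sigmoid(logits) > 0.5)`; argmax/thresholding belongs after the loss, never before it.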