Equivalent of TensorFlow’s softmax_cross_entropy_with_logits in PyTorch (vision)

My model outputs a tensor of logits, `logits_is_obj_present`.
To compute the loss, I need the PyTorch equivalent of softmax_cross_entropy_with_logits (or softmax_cross_entropy).
Please suggest one.


I’m not deeply familiar with TensorFlow’s softmax_cross_entropy_with_logits, but it should correspond to http://pytorch.org/docs/master/nn.html#torch.nn.CrossEntropyLoss, which also operates on raw logits. Please let me know if this is what you’re looking for.
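A minimal sketch of how that correspondence works, with hypothetical shapes (batch of 4, 3 classes): `nn.CrossEntropyLoss` fuses `log_softmax` and negative log-likelihood, so like TF's softmax_cross_entropy_with_logits it expects unnormalized logits. One difference to watch for: PyTorch takes integer class indices as targets, while the TF op takes one-hot (or soft) label distributions.

```python
import torch
import torch.nn as nn

# Hypothetical example: batch of 4 samples, 3 classes.
logits = torch.randn(4, 3)            # raw, unnormalized scores (no softmax applied)
targets = torch.tensor([0, 2, 1, 2])  # class indices, NOT one-hot vectors

# CrossEntropyLoss = LogSoftmax + NLLLoss, applied to raw logits.
criterion = nn.CrossEntropyLoss()
loss = criterion(logits, targets)

# Equivalent manual computation, mirroring the TF op's definition
# (negative log of the softmax probability of the true class):
log_probs = torch.log_softmax(logits, dim=1)
manual = -log_probs[torch.arange(4), targets].mean()
assert torch.allclose(loss, manual)
```

If your labels are stored one-hot, convert them with `targets = one_hot.argmax(dim=1)` before calling the criterion.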