Hello all, in Caffe I used SoftmaxWithLoss
for a multi-class segmentation problem:
(Caffe) block (n) → BatchNorm → ReLU → SoftmaxWithLoss
Which loss in PyTorch should I use to achieve a similar result to Caffe? Thanks
I think that would be
import torch.nn.functional as F
F.cross_entropy()
or its equivalent in the object-oriented API,
torch.nn.CrossEntropyLoss
These take the raw logits as input and internally compute the log-softmax, then pass the result to the negative log-likelihood loss (the multinomial logistic loss), so you should not apply a softmax layer yourself before the loss.
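A minimal sketch of how this looks for segmentation, assuming hypothetical shapes (batch of 2 images, 5 classes, 8×8 resolution) and random data just for illustration:

```python
import torch
import torch.nn.functional as F

# Hypothetical segmentation shapes: (N, C, H, W) logits, (N, H, W) targets.
logits = torch.randn(2, 5, 8, 8)           # raw scores, no softmax applied
target = torch.randint(0, 5, (2, 8, 8))    # one class index per pixel

# cross_entropy = log_softmax + nll_loss in a single call
loss = F.cross_entropy(logits, target)

# Equivalent object-oriented API
criterion = torch.nn.CrossEntropyLoss()
loss2 = criterion(logits, target)

assert torch.allclose(loss, loss2)
```

Note that the target holds integer class indices (not one-hot maps), matching what Caffe's SoftmaxWithLoss expects.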