loss = F.softmax(x, dim=1) * F.log_softmax(x, dim=1)
loss = loss.sum()
I want to build an entropy loss (not a cross-entropy loss).
Is the code above correct for computing the entropy of the predictions?
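Almost: `p * log p` summed over the class dimension is the *negative* entropy, so the expression needs a minus sign to yield the entropy H(p) = -Σ p log p. It is also common to reduce over classes first (`dim=1`) and then average over the batch. A minimal sketch of that fix (the function name `entropy_loss` and the mean-over-batch reduction are my own choices, not from the original post):

```python
import torch
import torch.nn.functional as F

def entropy_loss(logits: torch.Tensor) -> torch.Tensor:
    """Mean entropy of the softmax predictions for a batch of logits."""
    p = F.softmax(logits, dim=1)          # probabilities, shape (batch, classes)
    log_p = F.log_softmax(logits, dim=1)  # numerically stable log-probabilities
    # -sum(p * log p) over classes gives entropy per sample; average the batch
    return -(p * log_p).sum(dim=1).mean()

x = torch.randn(4, 10)
h = entropy_loss(x)  # non-negative, at most log(10) for 10 classes
```

Using `F.log_softmax` instead of `torch.log(F.softmax(...))` avoids log-of-zero issues when some class probabilities underflow. Note that entropy is maximized (at log of the class count) when the logits are uniform, so as a loss term it encourages confident predictions when minimized and uncertain ones when negated.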