Entropy loss (not CrossEntropy)

import torch.nn.functional as F

loss = F.softmax(x, dim=1) * F.log_softmax(x, dim=1)  # elementwise p * log p
loss = loss.sum()

I want to build an entropy loss (not a CrossEntropy loss).
Is the code above correct for computing the entropy of the predictions?
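
For context, the quantity I am after is the Shannon entropy of each predicted distribution, H(p) = -Σ_c p_c log p_c, averaged over the batch. Below is a minimal sketch of how I think that would look, assuming x holds raw logits of shape (batch, num_classes); the function name entropy_loss is just illustrative:

import torch
import torch.nn.functional as F

def entropy_loss(x):
    # x: raw logits, shape (batch, num_classes)
    p = F.softmax(x, dim=1)          # predicted probabilities
    log_p = F.log_softmax(x, dim=1)  # log-probabilities (numerically stable)
    # Shannon entropy per sample: H(p) = -sum_c p_c * log p_c
    per_sample = -(p * log_p).sum(dim=1)
    return per_sample.mean()         # average over the batch

# usage sketch with a hypothetical batch of logits
x = torch.randn(8, 10)
print(entropy_loss(x))

In particular, I am unsure whether the missing minus sign and summing over the whole tensor (rather than per sample) in my two-line version matter.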