Hi, all.
As far as I know, the 1st argument to `nn.KLDivLoss()` needs to be log-probabilities, e.g. the output of `F.log_softmax()`.
(In my case, there is no softmax in classifier.)

```python
class Classifier(nn.Module):
    def __init__(self, channel, classes=10):
        super(Classifier, self).__init__()
        self.fc = nn.Linear(channel, classes)

    def forward(self, x):
        x = self.fc(x)
        return x
```
```python
pred1 = classifier(input)
pred2 = classifier(input)
# 1)
kld_loss = nn.KLDivLoss()(F.log_softmax(pred1, dim=1), pred2)
# 2)
kld_loss = nn.KLDivLoss()(F.log_softmax(pred1, dim=1), F.softmax(pred2, dim=1))
```

Based on the docs, the target (second argument) should be passed as probabilities, so 2) looks correct:

> As with `NLLLoss`, the input given is expected to contain log-probabilities and is not restricted to a 2D Tensor. The targets are given as probabilities (i.e. without taking the logarithm).
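To convince myself, here is a minimal sketch (with hypothetical random logits standing in for the classifier outputs) checking that option 2) with `reduction='batchmean'` agrees with a manual KL-divergence computation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
pred1 = torch.randn(4, 10)  # hypothetical logits in place of classifier(input)
pred2 = torch.randn(4, 10)

# Option 2): input as log-probabilities, target as probabilities.
# 'batchmean' sums over classes and averages over the batch, which is
# the mathematically correct KL divergence per sample.
kld = nn.KLDivLoss(reduction='batchmean')(
    F.log_softmax(pred1, dim=1), F.softmax(pred2, dim=1))

# Manual KL(target || input) = sum_c p_c * (log p_c - log q_c), averaged over batch
p = F.softmax(pred2, dim=1)
log_q = F.log_softmax(pred1, dim=1)
manual = (p * (p.log() - log_q)).sum(dim=1).mean()

print(torch.allclose(kld, manual))
```

Note that passing the raw logits `pred2` as the target (option 1) gives a different, meaningless number, since the loss treats the target values as probabilities.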