Hi, all.

As far as I know, the first argument to `nn.KLDivLoss()` needs to be log-probabilities, e.g. the output of `F.log_softmax()`.

What about the second argument?

(In my case, the classifier has no softmax at the end.)

```
class Classifier(nn.Module):
    def __init__(self, channel, classes=10):
        super(Classifier, self).__init__()
        self.fc = nn.Linear(channel, classes)

    def forward(self, x):
        x = self.fc(x)
        return x
```

```
pred1 = classifier(input)
pred2 = classifier(input)

# Option 1: raw logits as the target
kld_loss = nn.KLDivLoss()(F.log_softmax(pred1, dim=1), pred2)

# Option 2: softmax probabilities as the target
kld_loss = nn.KLDivLoss()(F.log_softmax(pred1, dim=1), F.softmax(pred2, dim=1))
```
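For context, the PyTorch docs say `nn.KLDivLoss` expects its input in log-space and its target as probabilities (unless `log_target=True` is set), which would point to option 2. Here is a quick standalone sanity check I put together with random toy logits standing in for my classifier outputs (not my actual model):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
pred1 = torch.randn(4, 10)  # toy logits, stand-in for classifier(input)
pred2 = torch.randn(4, 10)

# Per the docs: input in log-space, target as probabilities,
# so the target goes through softmax (option 2).
kld = nn.KLDivLoss(reduction="batchmean")
loss = kld(F.log_softmax(pred1, dim=1), F.softmax(pred2, dim=1))

# Sanity check: KL(p || p) should be ~0 when both arguments
# describe the same distribution.
same = kld(F.log_softmax(pred1, dim=1), F.softmax(pred1, dim=1))
```

With option 1 the raw logits can be negative and do not sum to 1, so the result is not a valid KL divergence, which is why I suspect option 2 is the right call.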