nn.LogSoftmax(dim=1)

Hello

I am working through this exercise from the PyTorch tutorial:
https://pytorch.org/tutorials/intermediate/char_rnn_classification_tutorial.html#creating-the-network

I have a question about `self.softmax = nn.LogSoftmax(dim=1)`.
Could someone explain to me why dim=1?

Thanks

PyTorch layers accept batched inputs, where the dimensions typically represent [batch_size, features, ...]. In a classification use case, dim=1 therefore corresponds to the class dimension. Applying log_softmax along this dimension transforms the logits into log probabilities and normalizes them over the classes, so each sample in the batch gets its own probability distribution.
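As a quick sketch (with made-up shapes, not the tutorial's actual network), you can check that exponentiating the log probabilities gives values that sum to 1 along dim=1, once per sample:

```python
import torch
import torch.nn as nn

# Toy batch of logits: 3 samples, 5 classes -> shape [batch_size, n_classes]
logits = torch.randn(3, 5)

log_probs = nn.LogSoftmax(dim=1)(logits)

# Exponentiating recovers probabilities; summing over the class
# dimension (dim=1) yields ~1.0 for each of the 3 samples.
probs = log_probs.exp()
print(probs.sum(dim=1))
```

If you used dim=0 instead, the normalization would run across the batch, mixing different samples together, which is not what you want for per-sample classification.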


Thanks a lot for your answer, sir.