Suppose I have an n-dimensional tensor and would like to apply LogSoftmax along a given dimension. Currently I am unable to do so. The documentation (http://pytorch.org/docs/master/_modules/torch/nn/modules/activation.html#LogSoftmax) states that `dim` can be passed to the constructor, but when I try something like
```python
import torch.nn as nn

soft_max = nn.LogSoftmax(3)
```

I get:

```
TypeError: __init__() takes exactly 1 argument (2 given)
```
So, am I doing something wrong, or is this feature not yet implemented in my version? Also, which dimension does softmax use by default when `dim` is not given?
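For reference, here is a minimal sketch of what I am trying to achieve, assuming a PyTorch version where the constructor accepts the `dim` keyword (the tensor shape here is just an example):

```python
import torch
import torch.nn as nn

# Example 4-D tensor; the goal is LogSoftmax along dim 3 (the last dim).
x = torch.randn(2, 3, 4, 5)
log_probs = nn.LogSoftmax(dim=3)(x)

# Sanity check: each slice along dim 3 should be a valid
# log-probability distribution, so exp() summed over dim 3 is 1.
sums = log_probs.exp().sum(dim=3)
print(torch.allclose(sums, torch.ones_like(sums)))  # True
```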