How do I use softmax?

torch.nn.functional.softmax requires a two-dimensional input. But I have an input with three dimensions (0, 1, 2), and I want to apply softmax along dimension 2.
for example:
s
Variable containing:
(0 ,.,.) =
5 1 1 1
(1 ,.,.) =
1 1 1 1
(2 ,.,.) =
1 1 1 1
[torch.cuda.FloatTensor of size 3x1x4 (GPU 0)]

After softmax(s):

F.softmax(s)
Variable containing:
(0 ,.,.) =
0.9647 0.3333 0.3333 0.3333
(1 ,.,.) =
0.0177 0.3333 0.3333 0.3333
(2 ,.,.) =
0.0177 0.3333 0.3333 0.3333
[torch.cuda.FloatTensor of size 3x1x4 (GPU 0)]

This result is not what I want, because it applies softmax along dimension 0.
How can I apply softmax along dimension 2? In other words, I have a multi-dimensional input and want to apply softmax along a dimension that I can specify.
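For context, here is a minimal sketch of the two usual approaches (not the thread's accepted answer): it assumes a PyTorch version where F.softmax accepts a dim argument, and shows a reshape workaround for older versions whose softmax only handles 2D input. The tensor values mirror the example above.

import torch
import torch.nn.functional as F

# Shape (3, 1, 4), matching the example tensor s above
s = torch.tensor([[[5., 1., 1., 1.]],
                  [[1., 1., 1., 1.]],
                  [[1., 1., 1., 1.]]])

# Recent PyTorch: pass the dimension explicitly
out = F.softmax(s, dim=2)

# Older versions (2D-only softmax): flatten the leading dimensions,
# apply softmax over the last dimension, then restore the original shape
flat = s.contiguous().view(-1, s.size(-1))   # shape (3, 4)
out_old = F.softmax(flat).view(s.size())     # legacy 2D call, back to (3, 1, 4)

print(out)  # each length-4 row along dimension 2 now sums to 1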


@apaszke @smth Thanks


I tried to run the code given by ypxie, but I don't get the correct answer.