Take softmax of tensor

What is the best way to apply softmax to a tensor?

Before PyTorch 0.3, I used to do

torch.nn.functional.softmax(tensor).data

But now it complains that “tensor” has to be a Variable. Of course I could do
torch.nn.functional.softmax(Variable(tensor), dim=1).data
but it’s pretty ugly.
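
For reference, a minimal sketch of that workaround (assuming a hypothetical 2-D scores tensor of shape batch x classes; names are illustrative):

import torch
from torch.autograd import Variable
import torch.nn.functional as F

scores = torch.randn(4, 10)  # example raw scores

# Wrap in a Variable, apply softmax along dim=1, then unwrap with .data
probs = F.softmax(Variable(scores), dim=1).data

print(probs.sum(1))  # each row sums to 1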

Is there a particular reason why softmax does not accept a tensor as input? If the input is a tensor, return a tensor; if the input is a Variable, return a Variable.


In 0.4 there will be no difference between Variable and Tensor. Until then, sorry for the ugliness.
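
For anyone reading this later, a sketch of how it looks once Variable and Tensor are merged (0.4 and later); no wrapping or .data is needed:

import torch
import torch.nn.functional as F

scores = torch.randn(4, 10)  # plain tensor of raw scores

# softmax applies directly to the tensor along the chosen dimension
probs = F.softmax(scores, dim=1)

print(probs.sum(1))  # each row sums to 1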
