What is the best way to apply softmax to a tensor?

Before PyTorch 0.3, I used to do

`torch.nn.functional.softmax(tensor).data`

But now it complains that “tensor” has to be a Variable. Of course, I could do

`torch.nn.functional.softmax(Variable(tensor), dim=1).data`

but it’s pretty ugly.

Is there a particular reason why softmax does not accept a tensor as input? It could simply return a tensor for tensor input, and a Variable for Variable input.
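For anyone reading this later: in PyTorch 0.4 the `Variable` wrapper was merged into `Tensor`, so `softmax` now accepts plain tensors directly and the wrapping workaround above is no longer needed. A minimal sketch of the call on a modern version (the tensor values here are just an illustration):

```python
import torch
import torch.nn.functional as F

# A plain tensor; no Variable wrapper required in PyTorch 0.4+.
t = torch.tensor([[1.0, 2.0, 3.0],
                  [1.0, 1.0, 1.0]])

# `dim` selects the axis along which the probabilities sum to 1.
probs = F.softmax(t, dim=1)

# Each row of `probs` is now a probability distribution.
print(probs.sum(dim=1))  # each entry is 1.0 (up to floating point)
```

Passing `dim` explicitly is also worth keeping even on old versions: calling `softmax` without it relies on an implicit default that depends on the input's dimensionality and was later deprecated.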