Can't specify dim in functional.softmax

The PyTorch documentation says: `torch.nn.functional.softmax(input, dim=None, _stacklevel=3)`
But when I run `torch.nn.functional.softmax(a, dim=1)`, I get the following error: `TypeError: softmax() got an unexpected keyword argument 'dim'`

Is this a bug or am I doing something wrong?


You are probably using 0.2 or 0.1.12. The kwarg was added after the 0.2 release; you can check the 0.2 docs to confirm.

I ran conda update pytorch and restarted Jupyter, but I’m still getting the error. Any idea why this could be?

The newest released version is still 0.2, and conda update will only give you that version. Should you want this functionality, you may build from source yourself (which is actually quite easy). Let me know if you have other questions :slight_smile:

Thanks for your help. Do you know what’s the minimum version I need?

Hmm, it was added somewhere between 0.2 and master, so the minimum version you need exists only on GitHub. You'll have to build from source no matter what; I'd just build from master :slight_smile:

Presumably I could just install python-pytorch-git from the AUR rather than building from source? (The reason I don't like building from source is that you don't get automatic updates.)

That is not maintained by our team. At a quick glance, it doesn't seem to get updated very frequently, so I'm not sure. But it never hurts to try.

Thinking about it, it would probably be more sensible just to swap the axes with .permute before and after applying softmax, rather than risk breaking things by updating to the latest version :slight_smile:
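For reference, that axis-swapping trick works for any op that is hard-wired to a particular axis, softmax included. A minimal numpy sketch (numpy stands in for torch here so the example is self-contained; the shapes and the `normalize_axis1` helper are made up for illustration):

```python
import numpy as np

def normalize_axis1(x):
    # Stand-in for an op hard-wired to axis 1
    # (like an old softmax without a `dim` kwarg).
    return x / x.sum(axis=1, keepdims=True)

x = np.abs(np.random.randn(2, 3, 4))

# Apply the axis-1-only op along axis 2 by swapping
# axes 1 and 2 before the op and swapping them back after.
out = normalize_axis1(x.swapaxes(1, 2)).swapaxes(1, 2)

# Each slice along axis 2 now sums to 1, and the shape is unchanged.
assert out.shape == x.shape
assert np.allclose(out.sum(axis=2), 1.0)
```

On a torch tensor the same pattern applies with `.permute` or `.transpose` in place of `swapaxes`.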


Sounds good if that works for you! We did have a pretty large softmax refactoring a while ago :slight_smile:

Hello, I have faced the same problem. Can I ask how to build from source? I am using conda, which doesn't work with torch 0.4. Thanks

What OS are you using?
Do you get any error while trying to install using conda?
You can find the install instructions here.


conda certainly works with 0.4, and even with 0.4.1 and newer.

Thanks. I solved it.

Excuse me, how can I solve this problem without updating the PyTorch version?

The `dim` argument is not supported in 0.2 and earlier.

Thanks for your help. I can't update PyTorch, but I really need this functionality. How can I alter the code?
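If upgrading really isn't an option, softmax along a chosen dim can also be computed by hand from exp, max, and sum, none of which need the `dim` kwarg on softmax itself. A numpy sketch of the idea (the `softmax_along` helper is hypothetical; on a torch tensor the same `exp`, `max`, and `sum` operations exist, though the exact argument names on very old versions should be double-checked against their docs):

```python
import numpy as np

def softmax_along(x, dim):
    # Subtract the max along `dim` for numerical stability,
    # then exponentiate and normalize along the same dim.
    z = x - x.max(axis=dim, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=dim, keepdims=True)

x = np.random.randn(2, 3)
p = softmax_along(x, dim=1)

# Each row is a valid probability distribution.
assert np.allclose(p.sum(axis=1), 1.0)
assert (p > 0).all()
```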