The PyTorch documentation says: torch.nn.functional.softmax(input, dim=None, _stacklevel=3)
But when I run torch.nn.functional.softmax(a, dim=1), I get the following error: TypeError: softmax() got an unexpected keyword argument 'dim'
The newest released version is still 0.2, and conda update will only give you that version. If you want this functionality now, you can build from source yourself (which is actually quite easy). Let me know if you have other questions.
Hmm, it was added somewhere between 0.2 and master, so the minimum version you need is only on GitHub. That means you have to build from source no matter what; I’d just build from master.
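For what it’s worth, here is a quick sanity check (just a sketch; the printed messages are illustrative) to confirm which version an install actually has and whether its softmax already accepts dim:

    import torch
    import torch.nn.functional as F

    print(torch.__version__)

    # Feature-detect the `dim` keyword: builds that predate it raise
    # TypeError before the function body even runs.
    try:
        F.softmax(torch.randn(2, 3), dim=1)
        print("this build's softmax accepts dim")
    except TypeError:
        print("this build's softmax does not accept dim")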
Presumably I could just install python-pytorch-git from the AUR rather than building from source? (The reason I don’t like building from source is that you don’t get automatic updates.)
Thinking about it, it would probably be more sensible just to swap the axes using .permute before and after applying softmax, rather than risk breaking things by updating to the latest version.
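In case anyone wants to do the same, here is a minimal sketch of that workaround for a 2D tensor, assuming the old (pre-0.3) F.softmax, which takes no dim argument and normalizes along dim 1 (on 0.2 you may also need to wrap the tensor in torch.autograd.Variable first):

    import torch
    import torch.nn.functional as F

    a = torch.randn(4, 5)

    # Old F.softmax has no `dim` keyword; on a 2D input it normalizes
    # along dim 1, so each row of the result sums to 1.
    softmax_dim1 = F.softmax(a)

    # To normalize along dim 0 instead, swap the axes before and after
    # the call, so each column of the result sums to 1.
    softmax_dim0 = F.softmax(a.permute(1, 0)).permute(1, 0)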