I upgraded to master (0.4.0a0+b608ea9). I thought the following would work after the Tensor/Variable merge. Do we still need to wrap tensors in torch.autograd.Variable explicitly from Python?
I’m trying the following:
import torch
torch.nn.functional.softmax(torch.Tensor(2, 3, 4, 5), dim = 1)
#Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# File ".../lib/python2.7/site-packages/torch/nn/functional.py", line 840, in softmax
# return torch._C._nn.softmax(input, dim)
#TypeError: softmax(): argument 'input' (position 1) must be Variable, not torch.FloatTensor
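For reference, a sketch of the workaround I'd expect to need on builds where the merge isn't complete: wrap the tensor in Variable before calling softmax. Note I'm using torch.randn here instead of torch.Tensor(2, 3, 4, 5), since the latter returns uninitialized memory.

```python
import torch
from torch.autograd import Variable

# Random data instead of an uninitialized tensor.
x = torch.randn(2, 3, 4, 5)

# Wrapping in Variable satisfies the 'input must be Variable' check
# on pre-merge builds; on post-merge builds it is a harmless no-op.
out = torch.nn.functional.softmax(Variable(x), dim=1)

print(out.shape)  # same shape as the input: (2, 3, 4, 5)
```

Along dim=1 each slice should sum to 1, as softmax normalizes over that dimension.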