Problem in version 0.4 with cross entropy loss


I am experiencing an issue that I did not have on previous versions.

When computing the cross entropy loss with the parameter reduce=False, I get this error:

RuntimeError: expand(torch.cuda.FloatTensor{[100]}, size=[]): the number of sizes provided (0) must be greater or equal to the number of dimensions in the tensor (1)

I am evaluating torch.nn.functional.cross_entropy(out, y, reduce=False)

where y has shape (100,) and out has shape (100, 10).
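For reference, here is a minimal sketch of the setup with the same shapes (random data standing in for my actual tensors). Note that in newer PyTorch releases reduction='none' replaces the deprecated reduce=False, and that a non-scalar loss needs either a reduction or an explicit gradient argument before backward() can run, which may be related to the error above:

```python
import torch
import torch.nn.functional as F

out = torch.randn(100, 10, requires_grad=True)  # logits, shape (100, 10)
y = torch.randint(0, 10, (100,))                # class indices, shape (100,)

# Per-sample losses, shape (100,); reduction='none' is the
# modern spelling of the deprecated reduce=False.
loss = F.cross_entropy(out, y, reduction='none')

# backward() on a vector loss needs a scalar or an explicit gradient:
loss.sum().backward()
# alternatively: loss.backward(torch.ones_like(loss))
```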

Can anyone give me some feedback? I have seen on other posts that people experienced this issue when calling backward() on a loss.