I have a dataset whose labels have shape [torch.FloatTensor of size 316x2], i.e. a one-hot encoding with 2 classes. My neural network's output layer has 2 neurons. When I try to use criterion = nn.CrossEntropyLoss() I get this error: Expected object of type Variable[torch.LongTensor] but found type Variable[torch.FloatTensor] for argument #1 'target'
How can I fix this? Thanks.
You should use a LongTensor with the class indices instead of a one-hot format. For two classes, the targets should thus contain the values 0 and 1.
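A minimal sketch of what this looks like (toy tensors, not your actual model — the batch size and logits here are made up for illustration):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Raw, unnormalized outputs from a 2-neuron output layer for a batch of 4 samples.
logits = torch.randn(4, 2)

# Targets are class indices as a LongTensor (shape [4]), NOT one-hot vectors.
targets = torch.tensor([0, 1, 1, 0])

loss = criterion(logits, targets)
print(loss.item())
```

If your labels are already one-hot encoded, `targets = one_hot.argmax(dim=1)` converts them to the index format the loss expects.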
Thank you for your reply sir. I have a few more questions. I have 2 classes in my dataset. Their labels are currently [0, 1] and [1, 0].
- Do you mean the label of each class would be an integer, e.g. 0 or 1?
- And does that mean that the last layer of my neural network should have only 1 neuron?
- So does that mean that I should not apply a softmax function at the output?
EDIT: I made sure my labels are now categorical, i.e. 0 or 1. I only have 1 neuron at the output layer. Now I receive this error: Assertion `cur_target >= 0 && cur_target < n_classes' failed. at c:\miniconda2\conda-bld\pytorch-cpu_1519448892730\work\torch\lib\thnn\generic/ClassNLLCriterion.c:87
Hello, I'm still unable to fix this error.
No, your last layer should return one value per class, so its output would have dimensions [batch_size, classes]. The target vector should only contain the class indices.
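Putting both fixes together, a sketch with the shapes from this thread (the `nn.Linear(10, 2)` model and the random features are stand-ins for your actual network and data):

```python
import torch
import torch.nn as nn

# Stand-in for your one-hot labels: FloatTensor of shape [316, 2].
one_hot = torch.eye(2)[torch.randint(0, 2, (316,))]

# Convert to class indices: LongTensor of shape [316] with values 0 or 1.
# This fixes the original FloatTensor-vs-LongTensor error.
targets = one_hot.argmax(dim=1)

# The output layer returns one value per class, so outputs are [316, 2].
# Keeping 2 output neurons (not 1) fixes the cur_target < n_classes assertion.
model = nn.Linear(10, 2)
outputs = model(torch.randn(316, 10))

# No softmax before the loss: CrossEntropyLoss applies log-softmax internally.
loss = nn.CrossEntropyLoss()(outputs, targets)
print(outputs.shape, targets.shape, loss.item())
```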
It is working now, thank you very much sir!