CrossEntropyLoss dimension out of range error

My data looks like:
x => [1,0,0,0,0,0,1,0,0,2…]
y => [1,0]

My model is defined like below:

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.l1 = nn.Linear(input_len, 200)
        self.l2 = nn.Linear(200, 200)
        self.l3 = nn.Linear(200, 2)

    def forward(self, x):
        x = F.relu(self.l1(x))
        x = F.relu(self.l2(x))
        return self.l3(x)

model = Net()

criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=0.0001)

And I try to test my model like this:

output = model(X_train[0])
print(output, Y_train[0])
loss = criterion(output, Y_train[0])

but I get this error:

tensor([0.5214, 0.4851], grad_fn=<SigmoidBackward>) tensor([1., 0.], grad_fn=<SelectBackward>)
RuntimeError: Dimension out of range (expected to be in range of [-1, 0], but got 1)

How can I handle this error?

Hi,

All models and loss functions in PyTorch assume that the first dimension is the batch size. So the first input to nn.CrossEntropyLoss should have shape batch_size x nb_classes, and the second should have shape batch_size and contain class indices between 0 and nb_classes-1 (not one-hot vectors).
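As a minimal sketch of the fix, using the logits printed in the error output above: add a batch dimension to the model output with unsqueeze, and convert the one-hot target [1., 0.] into a class-index LongTensor (here class 0) with argmax.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Single-sample logits of shape (2,) -> add a batch dim: shape (1, 2)
output = torch.tensor([0.5214, 0.4851]).unsqueeze(0)

# One-hot target [1., 0.] -> class index 0, as an int64 tensor of shape (1,)
target = torch.tensor([1.0, 0.0]).argmax().unsqueeze(0)

loss = criterion(output, target)
print(loss.item())  # a positive scalar, roughly 0.675 for these logits
```

If you train on mini-batches later, the DataLoader will add the batch dimension for you; the unsqueeze is only needed when feeding a single sample. Storing Y_train as class indices from the start avoids the argmax conversion.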
