Cross entropy Tensor Dimensions

Hello!!
I am having trouble figuring out what is wrong here! I am getting a dimension error from the tensors passed to the Cross Entropy Loss (a classic, I know :stuck_out_tongue: ).

criterion = nn.CrossEntropyLoss()
classification_loss = criterion(y_logit.unsqueeze(1), y.long())

IndexError: Target 1 is out of bounds.

The dimensions and the tensors are the following (batch_size = 10):

y_logit  -> 
tensor([1.0267, 0.0967, 1.1793, 1.0542, 1.4097, 1.5651, 1.5124, 1.2934, 1.9106, 0.9233])
y_logit shape -> torch.Size([10, 1])

y -> tensor([1., 0., 1., 1., 0., 1., 1., 0., 1., 0.])
y shape -> torch.Size([10])

The y_logit is the output of a linear layer, without any sigmoid or softmax applied (not sure if that is needed, though!). I converted it into a tensor with the following line before running the criterion: y_logit = torch.from_numpy(y_logit)
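
For reference, here is a minimal sketch of that conversion step (the numpy array here is just a stand-in for my actual linear layer output). torch.from_numpy keeps the numpy dtype, so a float64 array becomes a float64 tensor; a .float() cast keeps it consistent with the usual float32 model outputs:

import numpy as np
import torch

# Stand-in for the linear layer output described above (shape [10])
y_logit_np = np.random.randn(10)

# torch.from_numpy preserves the numpy dtype, so cast to float32 explicitly
y_logit = torch.from_numpy(y_logit_np).float()
print(y_logit.dtype, y_logit.shape)  # torch.float32 torch.Size([10])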

Does anyone know the answer to this?
Thank you in advance!

Update: I had to change to nn.BCEWithLogitsLoss(), since this is binary classification, not multi-class.
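
In case it helps anyone else, here is a minimal sketch of the working setup, assuming one raw logit per sample (shape [10]) and float targets of the same shape. BCEWithLogitsLoss applies the sigmoid internally, so no activation is added beforehand:

import torch
import torch.nn as nn

criterion = nn.BCEWithLogitsLoss()

# Raw logits from the linear layer, shape [10] (no sigmoid applied)
y_logit = torch.tensor([1.0267, 0.0967, 1.1793, 1.0542, 1.4097,
                        1.5651, 1.5124, 1.2934, 1.9106, 0.9233])

# Targets stay float and keep the same shape as the logits
y = torch.tensor([1., 0., 1., 1., 0., 1., 1., 0., 1., 0.])

classification_loss = criterion(y_logit, y)  # shapes [10] and [10] match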