ValueError: Expected target size different

Hi everyone. I have been trying to solve this issue for about four hours now and can't get anywhere. I am trying to train an RNN model with a GRU to classify three different languages from wav files, and I still can't find the reason for the error.

import torch
import torch.nn as nn

# model, device and train_dataloader are defined earlier (omitted here)
loss_func = nn.NLLLoss()
learning_rate = 0.01
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

num_epochs = 25
model.train()

for epoch in range(num_epochs):
    train_loss = 0

    for x, y in train_dataloader:
        x = x.to(device)
        y = y.to(device)

        yhat = model(x)
        print(yhat.shape)

        loss = loss_func(yhat, y)

        model.zero_grad()
        loss.backward()
        optimizer.step()

        train_loss += loss.item()  # use .item() so the computation graph is not kept around

    if not (epoch % 1):  # log every epoch
        print(f'Epoch: {epoch+1:02d}, ' +
              f'Loss: {train_loss / len(train_dataloader.dataset):.4f}')

print('Finished Training')

The shape of the output yhat is (60, 1000, 3), and the last layer uses a softmax activation. The shape of the target y is (60, 1000, 1).

The error it raises is: ValueError: Expected target size (60, 3), got torch.Size([60, 1000, 1])

Is there something wrong with my labels?

I’m not familiar with your use case and thus don’t know what the dimensions mean in your output.
However, nn.NLLLoss (and nn.CrossEntropyLoss) expect a model output in the shape [batch_size, nb_classes, *] and a target in [batch_size, *] containing class indices in the range [0, nb_classes-1].
Note that the * stands for additional dimensions.
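
To make that concrete, here is a minimal sketch with made-up sizes (4 samples, 5 classes, 7 time steps) showing the layout nn.NLLLoss expects:

import torch
import torch.nn as nn
import torch.nn.functional as F

batch_size, nb_classes, seq_len = 4, 5, 7

# output: [batch_size, nb_classes, seq_len], log probabilities over dim=1
output = F.log_softmax(torch.randn(batch_size, nb_classes, seq_len), dim=1)
# target: [batch_size, seq_len], class indices in [0, nb_classes - 1]
target = torch.randint(0, nb_classes, (batch_size, seq_len))

loss = nn.NLLLoss()(output, target)
print(loss)  # scalar loss averaged over all positions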

In your case it seems that you are using [batch_size=60, nb_classes=1000, seq_len=3] in your output.
If that’s the case, the target should have the shape [60, 3] and contain values in [0, 999].
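
If instead the 3 is the class dimension (your three languages) and 1000 is the sequence length, which is just my assumption based on your description, you would need to move the class dimension to dim1 and squeeze the target, roughly like this:

# assuming yhat is [60, 1000, 3] = [batch, seq_len, nb_classes]
# and y is [60, 1000, 1] = [batch, seq_len, 1]
yhat = yhat.permute(0, 2, 1)   # -> [60, 3, 1000] = [batch, nb_classes, seq_len]
y = y.squeeze(-1).long()       # -> [60, 1000], class indices in [0, 2]
loss = loss_func(yhat, y)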

Also, nn.NLLLoss expects log probabilities, so use F.log_softmax instead of the softmax activation.
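
A rough sketch of that change (logits here is just a stand-in name for the output of your last linear layer):

import torch.nn.functional as F

# in the model's forward pass, apply log_softmax over the class dimension
# (dim=1 in the [batch, nb_classes, seq_len] layout):
log_probs = F.log_softmax(logits, dim=1)

# alternatively, return the raw logits and switch to nn.CrossEntropyLoss,
# which applies log_softmax + NLLLoss internally:
# loss_func = nn.CrossEntropyLoss()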

Thank you for the response; it helps a lot.