It won’t produce the same loss, as the default reduction in nn.CrossEntropyLoss calculates the mean loss value over the batch. If you set reduction='sum', you should get the same loss.
However, if you need the loss for each sample in the batch, just disable the reduction via reduction='none'.
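A quick sketch showing how the three reductions relate (the tensor shapes and seed are just for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 3)            # batch of 4 samples, 3 classes
targets = torch.tensor([0, 2, 1, 0])  # one class index per sample

loss_mean = nn.CrossEntropyLoss()(logits, targets)                 # default: mean over batch
loss_sum = nn.CrossEntropyLoss(reduction='sum')(logits, targets)   # sum over batch
loss_none = nn.CrossEntropyLoss(reduction='none')(logits, targets) # per-sample losses

# mean = sum / batch_size, and both are reductions of the per-sample losses
assert torch.allclose(loss_sum, loss_none.sum())
assert torch.allclose(loss_mean, loss_none.mean())
```

So reduction='none' returns one loss value per sample, which you can then aggregate however you like.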