For the training of my project I use two models, which means I have two outputs.
They use the following loss functions:
ModelA: nn.CrossEntropyLoss()
ModelB: nn.SmoothL1Loss(reduction='mean')
I am a bit confused about the tensor shapes they expect.
Just wanted to make sure:
loss_A_fn will get:
predictedTensor: [BATCH, NUM_CLASSES], e.g. [8, 1]
labelTensor: [BATCH] of class indices, e.g. [8]
1) Is that right?
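To make question 1 concrete, this is what I believe the shapes have to look like. A minimal sketch (the class count of 4 here is just a made-up placeholder, my real model has a different second dimension):

```python
import torch
import torch.nn as nn

batch, num_classes = 8, 4  # placeholder sizes, not my real model

loss_A_fn = nn.CrossEntropyLoss()

# prediction: raw logits of shape [BATCH, NUM_CLASSES]
logits = torch.randn(batch, num_classes)
# target: class indices of shape [BATCH], dtype long
labels = torch.randint(0, num_classes, (batch,))

loss_A = loss_A_fn(logits, labels)
print(loss_A.shape)  # torch.Size([]) -- a scalar
```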
I think this points in that direction: Looking for a cross entropy loss that accepts two tensors of the same shape
But my loss_B_fn will need [8, 1] for both inputs (pred and label).
2) Is that right?
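Again, to make question 2 concrete, here is a small sketch of what I mean. As far as I understand, pred and label should have the exact same shape here, because otherwise broadcasting could silently produce something unintended:

```python
import torch
import torch.nn as nn

loss_B_fn = nn.SmoothL1Loss(reduction='mean')

# both tensors have the same shape [BATCH, 1]
pred = torch.randn(8, 1)
label = torch.randn(8, 1)

loss_B = loss_B_fn(pred, label)
print(loss_B.shape)  # torch.Size([]) -- a scalar
```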
If yes:
Since my labels come out as flat 1-D tensors of shape [8], I reshape them with:
distance_label_dl = distance_label_dl.reshape(distance_label_dl.shape[0], 1)
3) Is that okay, or shall I rather use another (better) way?
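For context on question 3, these are the variants I am aware of; as far as I can tell they all produce the same [8, 1] result:

```python
import torch

distance_label_dl = torch.randn(8)  # stand-in for my label batch

a = distance_label_dl.reshape(distance_label_dl.shape[0], 1)  # my current version
b = distance_label_dl.unsqueeze(1)    # adds a dimension at index 1
c = distance_label_dl.reshape(-1, 1)  # batch-size agnostic

print(a.shape, b.shape, c.shape)  # all torch.Size([8, 1])
```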
4) Do I have to do this within my train_loop, or shall I do it within the CustomDataset?
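To illustrate the CustomDataset option from question 4: if each sample's label is returned with shape [1], the DataLoader's default collation stacks them into [BATCH, 1] automatically, so no reshape is needed in the train loop. A hypothetical sketch (field names and the feature size 3 are made up, not my actual code):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    def __init__(self, features, distance_labels):
        self.features = features            # e.g. [N, 3]
        self.distance_labels = distance_labels  # e.g. [N]

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        x = self.features[idx]
        # return the label with shape [1]; the DataLoader then
        # collates a batch of them into shape [BATCH, 1]
        y = self.distance_labels[idx].reshape(1)
        return x, y

features = torch.randn(8, 3)  # dummy data
labels = torch.randn(8)
loader = DataLoader(CustomDataset(features, labels), batch_size=8)

xb, yb = next(iter(loader))
print(yb.shape)  # torch.Size([8, 1])
```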