For the training of my project I use two models, which means I have two outputs.
They use the following loss functions:
I am a bit confused about the tensor shapes they expect.
Just wanted to make sure:
loss_A_fn will get:
predictedTensor: [BATCH, VALUE] like [8,1]
labelTensor: [VALUE, VALUE, …], i.e. a 1-D tensor
1) Is that right?
I think this points into this direction: Looking for a cross entropy loss that accepts two tensors of the same shape
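Yes, if loss_A_fn is `nn.CrossEntropyLoss`, the prediction is 2-D `[BATCH, CLASSES]` and the target is 1-D `[BATCH]` with class indices. A minimal sketch, assuming loss_A_fn is `nn.CrossEntropyLoss` and using CLASSES = 3 just for illustration:

```python
import torch
import torch.nn as nn

# Assumption: loss_A_fn is nn.CrossEntropyLoss; CLASSES = 3 is illustrative.
loss_A_fn = nn.CrossEntropyLoss()

pred = torch.randn(8, 3)           # predictedTensor: [BATCH, CLASSES]
label = torch.randint(0, 3, (8,))  # labelTensor: [BATCH], class indices (not one-hot)

loss = loss_A_fn(pred, label)      # scalar loss
```

Note that the target here is a different shape than the prediction, which is why the thread linked above (same-shape cross entropy) exists as a separate question.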
But my loss_B_fn will need [8, 1] for both inputs (pred, label).
2) Is that right?
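For element-wise losses this is correct: prediction and target must have the same shape. A minimal sketch, assuming loss_B_fn is a regression-style loss such as `nn.MSELoss` (the actual loss is not stated above):

```python
import torch
import torch.nn as nn

# Assumption: loss_B_fn is an element-wise loss, e.g. nn.MSELoss.
loss_B_fn = nn.MSELoss()

pred = torch.randn(8, 1)   # [BATCH, 1]
label = torch.randn(8, 1)  # [BATCH, 1] -- must match pred's shape

loss = loss_B_fn(pred, label)  # scalar loss
```

Passing a `[8]` target against a `[8, 1]` prediction would trigger broadcasting (and a warning in recent PyTorch versions), so matching the shapes explicitly is the safe choice.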
Since my labels are all hot(?) tensors, I reshape them with:
distance_label_dl = distance_label_dl.reshape(distance_label_dl.shape[0], 1)
3) Is that okay, or should I rather use another (better) way?
4) Do I have to do this within my train_loop, or should I rather do it within the Dataset?
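Either place works, but doing the reshape once where the data is prepared keeps the train loop clean. A hedged sketch, assuming a custom `Dataset` (all names here are hypothetical):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class DistanceDataset(Dataset):
    """Hypothetical dataset that reshapes labels once, up front."""

    def __init__(self, features, labels):
        self.features = features
        self.labels = labels.reshape(-1, 1)  # [N] -> [N, 1] once, here

    def __len__(self):
        return len(self.features)

    def __getitem__(self, idx):
        return self.features[idx], self.labels[idx]

ds = DistanceDataset(torch.randn(32, 4), torch.randn(32))
loader = DataLoader(ds, batch_size=8)
x, y = next(iter(loader))
print(y.shape)  # torch.Size([8, 1]) -- already the shape loss_B_fn needs
```

With this, each batch arrives with labels already in `[BATCH, 1]`, and the train loop contains no shape-fixing code.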