As the title suggests, I want to implement cross entropy with softmax on logits, where the target has the format [batchsize, C, H, W] with values in the range [0, 1). The values along axis 1 do not sum to 1.

torch.nn.functional.cross_entropy does not allow 4D targets.

Cross entropy is used for classification problems where the length of your network's output is the number of classes you have.
So if you pass an image through the network, it outputs C values, where C is the number of classes, giving an output of shape (N, C), with N the batch size. What cross entropy does is take the softmax of the outputs to get probability scores, and then compute the negative log-likelihood between the scored output and the target.

So in other words, cross entropy only takes values with shape (N, C).
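To illustrate the point above, here is a small sketch (shapes are illustrative) showing that torch.nn.functional.cross_entropy is exactly log_softmax followed by negative log-likelihood on (N, C) logits with class-index targets:

```python
import torch
import torch.nn.functional as F

# Toy example: batch of 4 samples, 3 classes.
logits = torch.randn(4, 3)
target = torch.tensor([0, 2, 1, 2])  # class indices, shape (N,)

# cross_entropy == log_softmax followed by NLL loss:
manual = F.nll_loss(F.log_softmax(logits, dim=1), target)
builtin = F.cross_entropy(logits, target)
assert torch.allclose(manual, builtin)
```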

@Henry_Chibueze, the TensorFlow version of cross entropy, tf.nn.softmax_cross_entropy_with_logits, allows me to pass a 4D target with continuous values by passing an axis argument identifying the class dimension. I am trying to find the PyTorch version of this.

Hi @KFrank, my target values do not sum to 1, that is, they are not soft labels. The values along the class axis are normalized Gaussian pulses with values between 0 and 1.

That's okay. You can still use the "soft-label cross-entropy" I linked to
above – the math still goes through. It won't – precisely speaking –
be a true cross-entropy (which compares two true probability
distributions), but it will give you a reasonable loss function.
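A minimal sketch of such a soft-label cross-entropy, written by hand so it accepts a 4D target (the function name and the random toy shapes are my own, not from the thread); note that nothing in it requires the target to sum to 1 along the class axis:

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, target):
    """Soft-label 'cross-entropy': -sum(target * log_softmax(logits)) over
    the class axis (dim=1), averaged over batch and spatial positions.
    Works for (N, C) as well as (N, C, H, W) inputs; the target is used
    as-is and need not sum to 1 along dim=1."""
    log_p = F.log_softmax(logits, dim=1)
    return -(target * log_p).sum(dim=1).mean()

# 4D example: N=2, C=3, H=W=4, target values are arbitrary in [0, 1).
logits = torch.randn(2, 3, 4, 4)
target = torch.rand(2, 3, 4, 4)
loss = soft_cross_entropy(logits, target)
```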

(And, to be sure, if you pass your targets that don't sum to one to tf.nn.softmax_cross_entropy_with_logits(), nothing changes,
and you will still have the same issue.)