Hi everyone!

I’m trying to reproduce a model originally programmed in TensorFlow, but I’m using PyTorch.

My question is: what is the PyTorch equivalent of TF's `softmax_cross_entropy()` function?

Thanks in advance!

Hi Deep!

In short, there is no *direct* equivalent, so you have to write your own.

A quick glance at the tensorflow documentation suggests that `tf.losses.softmax_cross_entropy()` has been deprecated in favor of `tf.nn.softmax_cross_entropy_with_logits()`, but that both of these take `labels` of shape `[nBatch, nClass]` that are probabilities (sometimes called “soft labels”).

In contrast, pytorch’s `torch.nn.CrossEntropyLoss` (or its functional version, `torch.nn.functional.cross_entropy()`) takes integer class labels of shape `[nBatch]`.

(Both the tensorflow and pytorch versions take *logits*, rather than probabilities, for the predictions you pass in, so that part’s the same.)
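To make the integer-label case concrete, here is a minimal sketch (tensor shapes and values are just made up for illustration) showing that `cross_entropy()` applied to logits is the same as doing `log_softmax` followed by the negative log-likelihood by hand:

```python
import torch
import torch.nn.functional as F

# logits (no softmax applied) for a batch of 4 samples over 3 classes
logits = torch.randn(4, 3)
# integer class labels of shape [nBatch]
labels = torch.tensor([0, 2, 1, 2])

# pytorch's built-in cross entropy expects exactly this: logits + integer labels
loss = F.cross_entropy(logits, labels)

# the same thing "by hand": log-softmax, then negative log-likelihood
manual = F.nll_loss(F.log_softmax(logits, dim=1), labels)
assert torch.allclose(loss, manual)
```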

If your problem does, in fact, use integer class labels, just use pytorch’s `CrossEntropyLoss`. But if your labels are probabilities, then you can write your own “soft-label” version, as described here:
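A minimal sketch of such a soft-label version (the helper name `soft_cross_entropy` is mine, not a pytorch API): it takes logits plus a `[nBatch, nClass]` tensor of probabilities, matching the tensorflow convention described above.

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, soft_labels):
    # soft_labels: probabilities of shape [nBatch, nClass], like TF's labels
    # cross entropy = -sum_c p_c * log q_c, averaged over the batch
    return -(soft_labels * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

logits = torch.randn(4, 3)
soft = torch.full((4, 3), 1.0 / 3)  # e.g. uniform soft labels
loss = soft_cross_entropy(logits, soft)
```

When the soft labels happen to be one-hot, this reduces to the ordinary integer-label cross entropy.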

Good luck.

K. Frank
