There is a question about cross_entropy


When I use the loss function cross_entropy, there is an error:

Traceback (most recent call last):
  File "", line 131, in
    optimizer.step()

RuntimeError: cuda runtime error (59) : device-side assert triggered at /opt/conda/conda-bld/pytorch_1512386481460/work/torch/lib/THC/generated/…/generic/

I added a sigmoid function before the net output, and then it runs correctly. But my input values are not only in 0~1. I am using an autoencoder, so I think the output after adding the sigmoid is not close to the input.

Why does this happen? How can I use cross_entropy correctly?

CrossEntropyLoss is used for classification with class labels.
I think it might not be the right criterion for your autoencoder, since your targets will most likely be floats.

As a workaround you could use nn.MSELoss as a criterion.
Would this work for you?
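A minimal sketch of the suggested workaround, using randomly generated tensors as stand-ins for the autoencoder's input and reconstruction (the shapes here are made up for illustration). With nn.MSELoss the target is simply the original input, and it can take any float values, so no sigmoid is needed on the output:

```python
import torch
import torch.nn as nn

# Stand-in data: batch of 4 samples with 10 features each (hypothetical shapes).
inputs = torch.randn(4, 10)            # original input, not restricted to 0~1
reconstruction = torch.randn(4, 10)    # stand-in for the autoencoder's output

criterion = nn.MSELoss()
loss = criterion(reconstruction, inputs)  # target = input; both are float tensors
print(loss.item())
```

In a real training loop, `reconstruction` would come from `model(inputs)` and you would call `loss.backward()` followed by `optimizer.step()`.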

nn.L1Loss works better than nn.MSELoss() for me, but I think nn.L1Loss is not the best loss function for an autoencoder.

So, for CrossEntropyLoss, the labels cannot be floats? Then what type must they be?

The targets should be of dtype=torch.long for CrossEntropyLoss.
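A small example of the expected usage (the batch size and class count are made up for illustration). The model output should be raw logits of shape [batch, num_classes], and the target a 1-D tensor of class indices with dtype torch.long:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

logits = torch.randn(4, 3)             # raw scores for 4 samples, 3 classes (no softmax)
targets = torch.tensor([0, 2, 1, 2])   # class indices; torch.tensor of ints is torch.long

loss = criterion(logits, targets)
print(targets.dtype)   # torch.int64 (i.e. torch.long)
print(loss.item())
```

Note that each target index must be in the range [0, num_classes - 1]; an out-of-range index is one common cause of the device-side assert seen above when running on CUDA.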

Thank you for your help