I am implementing a variational autoencoder. The input has both positive and negative values and is NOT between 0 and 1, so I applied a log2 transform before training the model. Is binary_cross_entropy a good choice in this case? After the first iteration it gives me the following error:
RuntimeError: Assertion `x >= 0. && x <= 1.' failed. input value should be between 0~1, but got nan at /Users/distiller/project/conda/conda-bld/pytorch_1556653492823/work/aten/src/THNN/generic/BCECriterion.c:60
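For context on why BCE needs inputs in [0, 1]: the per-element term contains log(p) and log(1 - p), which are undefined once p leaves that range. A minimal pure-Python sketch (the function name is mine, with float scalars standing in for tensor entries):

```python
import math

def bce_elem(p, t):
    # Per-element binary cross-entropy: -(t*log(p) + (1-t)*log(1-p))
    return -(t * math.log(p) + (1 - t) * math.log(1 - p))

print(bce_elem(0.8, 1.0))  # fine: p is inside (0, 1)

# An input outside (0, 1) makes log(1 - p) undefined
# (Python raises here; torch would produce nan instead):
try:
    bce_elem(1.5, 1.0)
except ValueError as e:
    print("p = 1.5 ->", e)
```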
I'm not sure whether your transformation returns output in [0, 1], but the error points to a NaN value.
Could you check why the output contains a NaN?
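One way to locate it: add a finiteness check on the reconstruction before the loss is computed. A minimal sketch (the helper name and the `recon_x` label are mine, just for illustration):

```python
import torch

def check_finite(name, tensor):
    # Hypothetical helper: fail loudly if any element is NaN,
    # so the bad tensor is caught before the BCE assertion fires.
    if torch.isnan(tensor).any():
        raise RuntimeError(f"{name} contains NaN values")

x = torch.tensor([0.2, float('nan'), 0.9])
try:
    check_finite("recon_x", x)
except RuntimeError as e:
    print(e)  # recon_x contains NaN values
```

Calling this on the model output (and, if it fires, on the layer inputs further upstream) usually narrows down where the NaN first appears.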
That's strange, since F.sigmoid should only yield values in the range [0, 1], so the clamp operation shouldn't change anything.
Are you sure the clamp fixed the issue?
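Note that a NaN produced upstream would survive both ops: sigmoid(nan) is nan, and in my quick test torch.clamp propagates NaN as well, so clamping alone shouldn't be able to mask it. A small sketch:

```python
import torch

z = torch.tensor([0.5, float('nan')])   # one valid logit, one NaN
s = torch.sigmoid(z)                    # sigmoid(nan) is still nan
c = s.clamp(0.0, 1.0)                   # clamp propagates the nan too
print(s)
print(c)
```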