If your problem is related to the presence of NaNs, I think you could:
use an if statement (or a tensor condition) to avoid x == 1; for instance, substitute a slightly smaller value such as x = 0.98;
before returning the value, check for NaNs: create a variable function = torch.where(x > 0, x, x / (1 - x)), then use torch.nan_to_num(function, nan=value) to replace any NaNs that remain.
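Both suggestions together might look like this. This is just a minimal sketch: the tensor x and the guard value 0.98 are placeholders for illustration, not from the original question.

```python
import torch

# Placeholder input; in practice x comes from your model
x = torch.tensor([-0.5, 0.3, 1.0])

# Suggestion 1: avoid x == 1 by substituting a slightly smaller value
x = torch.where(x == 1, torch.full_like(x, 0.98), x)

# Suggestion 2: compute the piecewise function, then replace any NaNs
function = torch.where(x > 0, x, x / (1 - x))
function = torch.nan_to_num(function, nan=0.0)
```

One thing to keep in mind: torch.where evaluates both branches, so a NaN produced by x / (1 - x) can still poison gradients even when that branch is not selected. Guarding x before the division, as above, avoids that.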
I don’t know if these solutions are the best, but I guess they’re worth trying.
Let me know if you’re able to solve the problem.
The solution is to use the Masking() layer available in Keras with mask_value=0. This is because the empty (all-zero) padding vectors would otherwise be included in the loss; with Masking(), as outlined in the Keras docs, the padding timesteps are skipped and not included.
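A small sketch of how the mask is produced; the shapes here are made up for illustration (two sequences of length 3 with 4 features, where all-zero timesteps are padding).

```python
import numpy as np
import tensorflow as tf

# Batch of 2 sequences, length 3, feature dim 4; all-zero rows are padding
inputs = np.array([
    [[1., 2., 3., 4.], [0., 0., 0., 0.], [5., 6., 7., 8.]],
    [[9., 9., 9., 9.], [1., 1., 1., 1.], [0., 0., 0., 0.]],
])

# Masking marks timesteps where every feature equals mask_value as skipped
masked = tf.keras.layers.Masking(mask_value=0.0)(inputs)

# Downstream layers such as LSTM read this mask and ignore masked steps
print(masked._keras_mask.numpy())
```

Layers like LSTM placed after Masking consume the mask automatically, so the padded timesteps no longer contribute to the loss.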