Normalizing flow gone wrong

Hello! I am using a normalizing flow (Neural Spline Flows) to approximate a probability distribution, and after some training the average loss is around 0.5 (i.e. the average log-prob is about -0.5). However, when I evaluate it on new test data, I get some log-prob values greater than zero, which would mean the probability for those elements is greater than one (which doesn't make sense). Does anyone know what could cause this? Isn't the flow supposed to keep all probabilities below 1 automatically? Thank you!
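
For reference, here is a minimal sketch of the kind of setup I mean (using the `nflows` library purely for illustration; my actual transform stack and hyperparameters differ):

```python
import torch
from torch import optim
from nflows.flows.base import Flow
from nflows.distributions.normal import StandardNormal
from nflows.transforms.autoregressive import (
    MaskedPiecewiseRationalQuadraticAutoregressiveTransform,
)

features = 2  # dimensionality of the data (illustrative)

# Neural Spline Flow: rational-quadratic spline transform over a Gaussian base
transform = MaskedPiecewiseRationalQuadraticAutoregressiveTransform(
    features=features,
    hidden_features=32,
    num_bins=8,
    tails="linear",
)
flow = Flow(transform, StandardNormal(shape=[features]))

optimizer = optim.Adam(flow.parameters(), lr=1e-3)
x = torch.randn(256, features)  # stand-in for real training data

for _ in range(100):
    optimizer.zero_grad()
    loss = -flow.log_prob(inputs=x).mean()  # negative mean log-prob
    loss.backward()
    optimizer.step()

# At test time, log_prob returns the log-density of each sample,
# and this is where I see values greater than zero
test_logprob = flow.log_prob(inputs=torch.randn(8, features))
```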


I'm not familiar with your approach, but could you explain how the log probabilities were calculated during training?
I would assume you are using e.g. F.log_softmax to compute them, or are they generated by some other method?
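
For example, with something like this the log probabilities are guaranteed to be <= 0, since each one is the log of a value from a distribution that sums to 1:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 10)               # e.g. a batch of 4 samples, 10 classes
log_probs = F.log_softmax(logits, dim=1)  # every entry <= 0; rows sum to 1 in probability space
```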