Conv1D outputs NaNs

I have a network which outputs NaNs after some epochs.
It looks like this:

self.filter = nn.Sequential(

The input vector is valid and doesn’t contain any NaNs.

It used to work fine with the following loss function:

distrib = torch.distributions.MultivariateNormal(y, torch.eye(y.size()[0]) * sigma)
loss = -distrib.log_prob(x)
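As a side note, with covariance sigma * I the negative log-likelihood has a simple closed form, which shows exactly how sigma enters the loss. A minimal pure-Python sketch (the mvn_nll helper name is mine, not from the original code):

```python
import math

def mvn_nll(x, y, sigma):
    """Negative log-likelihood of x under N(y, sigma * I),
    i.e. the closed form of -MultivariateNormal(y, eye(d) * sigma).log_prob(x)."""
    d = len(y)
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return 0.5 * (d * math.log(2 * math.pi * sigma) + sq_dist / sigma)

# At x == y the loss reduces to 0.5 * d * log(2*pi*sigma):
print(mvn_nll([0.0, 0.0], [0.0, 0.0], 1.0))  # ≈ 1.8379 (= log(2*pi))
```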

but I modified it so that I could compare different models with different sigmas:

distrib = torch.distributions.MultivariateNormal(y, torch.eye(y.size()[0]))
loss = -(distrib.log_prob(x)/distrib.log_prob(y))

I use the Adam optimizer; SGD doesn’t work either. What is wrong?

Fixed it:
You can’t just divide the log_probs to normalize; you have to exponentiate them first:

loss = -(distrib.log_prob(x).exp() / distrib.log_prob(y).exp()).log()

Works fine now!

Leaving the thread in case it can be of use to anyone.
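One footnote for future readers: the exp()/log() round trip can underflow to zero (and then log to -inf) when log_prob is very negative, which is common in high dimensions. Since log(p(x)/p(y)) = log_prob(x) - log_prob(y), the same loss can be computed directly in log space. A sketch with made-up x and y tensors standing in for the target and the network output:

```python
import torch

y = torch.zeros(3)                   # hypothetical model output (mean)
x = torch.tensor([0.5, -0.2, 0.1])   # hypothetical target
distrib = torch.distributions.MultivariateNormal(y, torch.eye(y.size(0)))

# exp()/log() round trip: underflows to -inf when log_prob is very negative
unstable = -(distrib.log_prob(x).exp() / distrib.log_prob(y).exp()).log()

# log-space form of the same quantity:
# -log(p(x)/p(y)) = log_prob(y) - log_prob(x)
stable = distrib.log_prob(y) - distrib.log_prob(x)

print(torch.allclose(unstable, stable))  # the two forms agree in this benign regime
```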