VAE not getting trained

Hi,
I am trying to generate samples with a VAE, but after training all I can see is a blob at the center of the image. I tried MSE loss and BCE loss, but the problem persists.
The loss shoots up and becomes NaN after a few epochs.
I tried MSE loss:

recons_loss = F.mse_loss(recons, input, reduction='mean')
kld_loss = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1), dim=0)

and also BCE loss:

recons_loss = F.binary_cross_entropy_with_logits(recons, input, reduction='sum')  # size_average is deprecated
kld_loss = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp()).mean()
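One thing worth checking here (not from the original post, just a common cause of this symptom): the two snippets above mix reductions, e.g. a mean-reduced reconstruction term against a KLD summed over the latent dimension, so the two terms live on very different scales. A minimal sketch of a loss where both terms are summed per sample and then averaged over the batch, with a hypothetical `beta` weight on the KL term:

```python
import torch
import torch.nn.functional as F

def vae_loss(recons, target, mu, log_var, beta=1.0):
    # Sum over pixels per sample, then average over the batch,
    # so both terms share the same scale.
    recon = (F.binary_cross_entropy_with_logits(recons, target, reduction='none')
             .flatten(1).sum(dim=1).mean())
    # KL divergence of N(mu, sigma^2) from N(0, I), reduced the same way.
    kld = (-0.5 * (1 + log_var - mu.pow(2) - log_var.exp()).sum(dim=1)).mean()
    return recon + beta * kld

# Toy check with a small random batch: the loss should be finite.
recons = torch.randn(4, 3 * 64 * 64)
target = torch.rand(4, 3 * 64 * 64)
mu = torch.randn(4, 16)
log_var = torch.randn(4, 16)
loss = vae_loss(recons, target, mu, log_var)
print(torch.isfinite(loss).item())  # True
```

Lowering `beta` at the start of training (KL warm-up) is another common way to keep the KL term from destabilizing early epochs.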

Could you let me know what the issue is here?

kaggle notebook link: Pokemon | Kaggle

Hello!
Looks like you have exploding gradients. I had the same problem in my project: NaN instead of values. This is my loss:

reconstruction_loss = self.loss(output, y)
kl_divergence = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
loss = reconstruction_loss + kl_divergence

The solution was to limit sigma (the dispersion) by clamping log_var:

# Learnable bounds on log_var (torch.clamp accepts tensor bounds in recent PyTorch)
self.max_logvar = nn.Parameter(torch.ones(latent_size) * 0.5, requires_grad=True)
self.min_logvar = nn.Parameter(torch.ones(latent_size) * -0.5, requires_grad=True)

def reparameterize(self, mu, logvar):
    # Clamp log_var so std = exp(0.5 * logvar) can never explode
    logvar = torch.clamp(logvar, min=self.min_logvar, max=self.max_logvar)
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)
    z = mu + eps * std
    return z