torch.distributions.kl.kl_divergence outputs negative values

Dear all,

I am trying to compute the symmetrized KL divergence between two multivariate Gaussians.

My code is as follows:

import torch
from torch.distributions.multivariate_normal import MultivariateNormal

p = MultivariateNormal(mu1, torch.diag_embed(std1))
q = MultivariateNormal(mu2, torch.diag_embed(std2))
# symmetrized KL divergence
kld = (torch.distributions.kl.kl_divergence(p, q)
       + torch.distributions.kl.kl_divergence(q, p))

where mu1, mu2, std1, std2 are positive vectors of the same length.

A priori everything seems correctly programmed, but the output of the KL divergence is sometimes negative, even though a (symmetrized) KL divergence should always be non-negative.

Can you please help me with this issue?

Environment details:
python: 3.8
pytorch: 1.6.0

The KL divergence for a full-rank multivariate normal is computed via Cholesky factorizations, log-determinants, and triangular solves, so when the true KL is close to zero, floating-point cancellation can make the result come out slightly negative. For diagonal Gaussians, try wrapping univariate Normals with distributions.Independent instead: the KL then reduces to the exact closed-form univariate formula summed over dimensions, which is far more stable. Avoiding tiny std values will also help. One more thing to check: MultivariateNormal's second argument is a covariance matrix, so torch.diag_embed(std1) treats the entries of std1 as variances; if they are standard deviations, you should pass torch.diag_embed(std1**2).
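
A minimal sketch of the Independent approach (the variable names mirror the question; the random values here are just placeholders for your mu1, mu2, std1, std2):

import torch
from torch.distributions import Independent, Normal
from torch.distributions.kl import kl_divergence
from torch.distributions.multivariate_normal import MultivariateNormal

torch.manual_seed(0)
# Placeholder data standing in for the vectors in the question.
mu1, mu2 = torch.randn(5), torch.randn(5)
std1, std2 = torch.rand(5) + 0.1, torch.rand(5) + 0.1  # keep stds away from 0

# Independent reinterprets the last batch dim as an event dim, so the KL is
# the closed-form univariate KL summed over dimensions -- no Cholesky involved.
p = Independent(Normal(mu1, std1), 1)
q = Independent(Normal(mu2, std2), 1)
sym_kld = kl_divergence(p, q) + kl_divergence(q, p)

# Equivalent full-covariance construction: note the variances (std**2),
# since MultivariateNormal expects a covariance matrix, not std deviations.
p_mvn = MultivariateNormal(mu1, torch.diag_embed(std1 ** 2))
q_mvn = MultivariateNormal(mu2, torch.diag_embed(std2 ** 2))
sym_kld_mvn = kl_divergence(p_mvn, q_mvn) + kl_divergence(q_mvn, p_mvn)

print(sym_kld.item(), sym_kld_mvn.item())  # non-negative and nearly equal

Both routes compute the same quantity; the Independent version simply avoids the linear-algebra round trip that introduces the cancellation error.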