KL divergence for a multivariate Gaussian VAE

How do I calculate the KL divergence for a multivariate Gaussian in a VAE? Is the following code correct?

        from torch.distributions import MultivariateNormal

        mu, logvar = self.encode(x.view(-1, 2))
        z = self.reparameterize(mu, logvar)
        # Posterior q(z|x): MultivariateNormal expects a covariance matrix,
        # so put the per-dimension variances exp(logvar) on the diagonal.
        q0 = MultivariateNormal(mu, torch.diag_embed(logvar.exp()))
        # Prior p(z) is the standard normal N(0, I); it should not be built
        # from the encoder outputs mu and logvar.
        prior = MultivariateNormal(torch.zeros_like(mu), torch.eye(mu.size(-1)))
        # MultivariateNormal.log_prob already sums over the event dimension,
        # so no extra .sum(-1) is needed.
        log_prior_z = prior.log_prob(z)
        log_q_z = q0.log_prob(z)
        # Single-sample Monte Carlo estimate of KL(q(z|x) || p(z)), per example.
        KL = log_q_z - log_prior_z
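
Note that for a diagonal Gaussian posterior and a standard normal prior you don't need to sample at all: the KL divergence has the closed form `-0.5 * sum(1 + logvar - mu^2 - exp(logvar))`. Below is a minimal sketch, assuming `mu` and `logvar` have shape `(batch, latent_dim)` as in the snippet above; the helper name `diagonal_gaussian_kl` is just illustrative:

    import torch
    from torch.distributions import Normal, Independent, kl_divergence

    def diagonal_gaussian_kl(mu, logvar):
        # Closed-form KL(N(mu, diag(exp(logvar))) || N(0, I)),
        # summed over latent dims, one value per batch element.
        return -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)

    # Cross-check with torch.distributions: Independent reinterprets a batch
    # of univariate Normals as a single diagonal multivariate Gaussian.
    mu = torch.randn(4, 2)
    logvar = torch.randn(4, 2)
    q = Independent(Normal(mu, (0.5 * logvar).exp()), 1)
    p = Independent(Normal(torch.zeros_like(mu), torch.ones_like(mu)), 1)
    assert torch.allclose(kl_divergence(q, p), diagonal_gaussian_kl(mu, logvar))

Most VAE implementations use the closed form directly in the loss, since the single-sample Monte Carlo estimate in the snippet above adds variance to the gradient.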