KL divergence between a Gaussian Mixture Model and a Normal

I am wondering whether there is a class in PyTorch similar to TensorFlow's `expectation` that can be used to compute KL divergences between distributions that do not have built-in KL divergence support in `torch.distributions`. Possibly these could be computed with Monte Carlo sampling: KL(p||q) = \int p(x) \log(p(x) / q(x)) dx = E_p[\log(p(x) / q(x))]. Some KL divergences have no closed-form formulation, for example the KL divergence between two Gaussian mixtures (`MixtureSameFamily`), or between a Gaussian mixture and a normal distribution (`Normal`). How would one handle these cases?
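For what it's worth, here is a minimal sketch of the Monte Carlo approach described above, using only existing `torch.distributions` classes. The helper `mc_kl_divergence` is my own name, not a PyTorch API; it estimates E_p[log p(x) - log q(x)] by averaging over samples drawn from p:

```python
import torch
from torch.distributions import Categorical, MixtureSameFamily, Normal

def mc_kl_divergence(p, q, num_samples=20000):
    """Monte Carlo estimate of KL(p||q) = E_p[log p(x) - log q(x)].

    Works for any pair of distributions that support .sample()
    (on p) and .log_prob() (on both p and q).
    """
    x = p.sample((num_samples,))                 # draw x ~ p
    return (p.log_prob(x) - q.log_prob(x)).mean()  # average the log-ratio

torch.manual_seed(0)

# p: a 1-D Gaussian mixture with two components
mix = Categorical(torch.tensor([0.3, 0.7]))
comp = Normal(torch.tensor([-1.0, 1.0]), torch.tensor([0.5, 0.8]))
p = MixtureSameFamily(mix, comp)

# q: a single standard normal
q = Normal(torch.tensor(0.0), torch.tensor(1.0))

kl = mc_kl_divergence(p, q)
print(kl)
```

The estimate is unbiased but noisy; increasing `num_samples` reduces the variance at the cost of more `log_prob` evaluations. Note this pattern only needs `sample` and `log_prob`, so it applies equally to two `MixtureSameFamily` instances.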