torch.distributions.kl.kl_divergence() raises NotImplementedError

I want to compute the KL divergence between two softmax outputs (PyTorch version 1.1.0),
but torch.distributions.kl.kl_divergence() raises NotImplementedError.
How should I use this function?

kl_divergence expects Distribution objects, not raw tensors; the NotImplementedError means no KL formula is registered for the types you passed in. Since softmax outputs are probabilities over discrete classes, you'd wrap each tensor with Categorical(probs=p) first.
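
A minimal sketch (the logits here are random placeholders standing in for your two model outputs):

```python
import torch
import torch.nn.functional as F
from torch.distributions import Categorical
from torch.distributions.kl import kl_divergence

# stand-in logits from two models; any shape (..., num_classes) works
logits_p = torch.randn(4, 10)
logits_q = torch.randn(4, 10)

# wrap the softmax outputs as Categorical distributions
p = Categorical(probs=F.softmax(logits_p, dim=-1))
q = Categorical(probs=F.softmax(logits_q, dim=-1))

kl = kl_divergence(p, q)  # shape (4,): one KL value per batch element
print(kl)
```

Note that kl_divergence returns one value per batch element rather than a reduced scalar, so apply .mean() or .sum() yourself if you need a single loss value.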
