Add KL div as regularization to CrossEntropyLoss

Hi,
I'm using CrossEntropyLoss, and I'd like to add the KL divergence between the labels and the predictions as a regularization term to the loss.

How can I do this?
Thanks!

Check out the VAE loss_function in PyTorch's examples here.
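As a rough sketch of what this could look like (not from the linked example): `F.kl_div(input, target)` computes KL(target ‖ input), where `input` must be log-probabilities and `target` probabilities. Note that if the labels are hard one-hot vectors, KL(labels ‖ predictions) reduces to the cross entropy itself, so the extra term only adds information when the labels are soft distributions. The function name, `kl_weight` parameter, and the assumption of soft labels below are all illustrative choices, not anything prescribed by the thread:

```python
import torch
import torch.nn.functional as F

def loss_with_kl_reg(logits, targets, soft_labels, kl_weight=0.1):
    """Hypothetical combined loss: cross entropy on hard class indices
    plus kl_weight * KL(soft_labels || predicted distribution)."""
    # Standard cross-entropy term on integer class targets
    ce = F.cross_entropy(logits, targets)
    # F.kl_div expects log-probabilities as input and probabilities as target
    log_probs = F.log_softmax(logits, dim=1)
    kl = F.kl_div(log_probs, soft_labels, reduction="batchmean")
    return ce + kl_weight * kl

# Example usage with random data (batch of 4, 3 classes)
logits = torch.randn(4, 3)
targets = torch.tensor([0, 1, 2, 0])
soft_labels = F.softmax(torch.randn(4, 3), dim=1)
loss = loss_with_kl_reg(logits, targets, soft_labels)
```

With `reduction="batchmean"` the KL term is averaged over the batch, matching the default mean reduction of `F.cross_entropy`, so `kl_weight` trades the two terms off on a comparable scale.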