Is there a KL divergence operation supported by Core ML conversion?

I have an nn.KLDivLoss() in my model to compute the divergence between two tensors.
However, when I tried to convert the PyTorch model to Core ML, I got the following error:

  File "/mnt/miniconda/envs/PY3/lib/python3.9/site-packages/coremltools/converters/mil/frontend/torch/ops.py", line 134, in convert_single_node
    raise RuntimeError(
RuntimeError: PyTorch convert function for op 'kl_div' not implemented.

What is an alternative way to implement this KL divergence calculation so the conversion succeeds? Thanks.

The Core ML converter has no translation for the `kl_div` op, so nn.KLDivLoss cannot be converted directly. You have a few options: reimplement the KL divergence from basic tensor operations that the converter does support (log, subtract, multiply, sum), register a composite/custom operator for `kl_div` with coremltools, or switch to an alternative loss whose ops the converter already handles. The coremltools documentation on composite operators covers the custom-op route.
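The simplest route is usually the first one: rebuild the loss from elementary ops so the `kl_div` node never appears in the traced graph. Below is a minimal sketch that mimics nn.KLDivLoss(reduction="batchmean"); the class name `KLDivFromBasicOps` and the `eps` stabilizer are illustrative choices of mine, not part of any official API, and it assumes the same input convention as nn.KLDivLoss (log-probabilities as input, probabilities as target).

```python
import torch
import torch.nn as nn


class KLDivFromBasicOps(nn.Module):
    """KL divergence rebuilt from ops the Core ML converter maps
    (log, sub, mul, sum, div), mimicking
    nn.KLDivLoss(reduction="batchmean")."""

    def __init__(self, eps: float = 1e-10):
        super().__init__()
        # Small constant (an assumption here) to keep log(target) finite
        # when the target distribution contains exact zeros.
        self.eps = eps

    def forward(self, log_input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Pointwise KL term: target * (log(target) - log_input)
        pointwise = target * (torch.log(target + self.eps) - log_input)
        # "batchmean" reduction: sum over all elements, divide by batch size
        return pointwise.sum() / log_input.shape[0]
```

After swapping this module in for nn.KLDivLoss, trace the model with torch.jit.trace and pass the traced module to coremltools.convert() as usual; since only log, sub, mul, sum, and div appear in the graph, the unsupported `kl_div` op is never emitted and the conversion should go through.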