Typo in KL divergence documentation?

Hi,

I just found that the documentation of `kl_div` gives the pointwise formula as `y * (log y - x)`, but what I remember about KL divergence is `y * log(y / x)` rather than `y * (log y - x)`. So what is the real implementation in PyTorch?
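For what it's worth, the two forms can be compared numerically. A minimal sketch, assuming (as the docs suggest) that `F.kl_div` expects its `input` argument already in log-space, i.e. `x = log p` for some probability vector `p`; under that assumption `y * (log y - x)` is exactly the textbook `y * log(y / p)`:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# x is a vector of log-probabilities, y a vector of probabilities
x = torch.log_softmax(torch.randn(3), dim=0)  # x = log p
y = torch.softmax(torch.randn(3), dim=0)

# PyTorch's documented pointwise formula, summed: sum y * (log y - x)
builtin = F.kl_div(x, y, reduction='sum')

# Textbook KL(y || p) with p = exp(x): sum y * log(y / p)
p = x.exp()
manual = (y * (y / p).log()).sum()

# The two agree because log(y / p) = log y - log p = log y - x
assert torch.allclose(builtin, manual)
```

So if this reading is right, there is no typo: the `- x` in the docs plays the role of the `log` in the denominator of `log(y/x)`, because the input is passed in log-space.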