nn.LogSoftmax() equation

As far as I know, nn.CrossEntropyLoss() is equivalent to

torch.mean(torch.sum(-target * F.log_softmax(input, dim=1), dim=1))

where target is a one-hot encoding of the class labels.
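A minimal sketch to check that claim (the one-hot conversion via F.one_hot is my own addition, matching the target assumed above):

import torch
import torch.nn.functional as F

torch.manual_seed(0)
input = torch.randn(4, 5)                           # raw logits
labels = torch.randint(0, 5, (4,))                  # integer class labels
target = F.one_hot(labels, num_classes=5).float()   # one-hot targets

manual = torch.mean(torch.sum(-target * F.log_softmax(input, dim=1), dim=1))
print(torch.allclose(manual, F.cross_entropy(input, labels)))  # True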

How can I express nn.LogSoftmax as an equation?

Does nn.LogSoftmax equal torch.mean(torch.sum(F.softmax(input) * torch.log(F.softmax(input)), dim=1))?
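No. nn.LogSoftmax is just the elementwise log of the softmax; there is no sum or mean over the batch. Written out (with the equivalent log-sum-exp form):

LogSoftmax(x)_i = log( exp(x_i) / sum_j exp(x_j) ) = x_i - log( sum_j exp(x_j) )

Your expression above is instead the negative entropy of the softmax distribution, averaged over the batch. The snippet below checks the equivalence numerically: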

import torch
import torch.nn as nn
import torch.nn.functional as F


x = torch.rand(4, 5) * 100              # random logits, scaled up to stress exp()
y_1 = nn.LogSoftmax(dim=-1)(x)          # module form
y_2 = torch.log(F.softmax(x, dim=-1))   # manual composition: log(softmax(x))

print((y_1 - y_2).abs().max())  # ~3e-8, i.e. equal up to float32 rounding
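The tiny difference is just floating-point rounding. F.log_softmax is still the better choice in practice: it evaluates the stable form x - logsumexp(x) (an assumption about the exact implementation, but the observable behavior matches), while composing log with softmax underflows to -inf once the logits get large:

x = torch.tensor([[1000.0, 0.0]])
print(torch.log(F.softmax(x, dim=-1)))  # tensor([[0., -inf]])   softmax underflows to 0
print(F.log_softmax(x, dim=-1))         # tensor([[0., -1000.]]) stable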

See the PyTorch documentation for nn.LogSoftmax for the formula.