Intel OpenVINO CPU extension LogSoftmax

If you are using nn.LogSoftmax + nn.NLLLoss for training, you can switch to nn.CrossEntropyLoss, which combines the two (during training), as discussed in this post. Then apply a plain nn.functional.softmax to the model output in evaluation mode, since that operation is supported by OpenVINO.
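
For context, here is a minimal sanity check (the shapes and class count are arbitrary, just for illustration) showing that nn.CrossEntropyLoss on raw logits matches nn.NLLLoss applied after nn.LogSoftmax:

import torch
import torch.nn as nn

logits = torch.randn(4, 10)          # raw model outputs for a batch of 4
target = torch.randint(0, 10, (4,))  # ground-truth class indices

# CrossEntropyLoss fuses LogSoftmax + NLLLoss into a single op
loss_ce = nn.CrossEntropyLoss()(logits, target)
loss_nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), target)
print(torch.allclose(loss_ce, loss_nll))  # True

With that in place, the model itself only needs to emit raw logits during training: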

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):

    def __init__(self, num_hidden, num_classes):
        super(Net, self).__init__()
        self.fc = nn.Linear(num_hidden, num_classes)

    def forward(self, input):
        output = self.fc(input)
        # Softmax is applied only at inference time; during training the
        # raw logits go straight into nn.CrossEntropyLoss.
        if not self.training:
            output = F.softmax(output, dim=1)
        return output
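
And a hedged usage sketch (the hidden size 128, class count 10, and the random batch below are made up for illustration): train on raw logits with nn.CrossEntropyLoss, then switch to eval mode to get probabilities from the same model:

model = Net(num_hidden=128, num_classes=10)
criterion = nn.CrossEntropyLoss()  # expects raw logits, so no LogSoftmax in the model

# training step: forward() returns raw logits while model.training is True
model.train()
x = torch.randn(8, 128)
y = torch.randint(0, 10, (8,))
loss = criterion(model(x), y)
loss.backward()

# evaluation: forward() now applies softmax, so outputs are probabilities
model.eval()
with torch.no_grad():
    probs = model(x)
print(probs.sum(dim=1))  # each row sums to ~1

Because the softmax is gated on self.training, the graph you export in eval mode ends with a plain Softmax node, which OpenVINO supports, while training still feeds raw logits to the loss.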