How to get the probability of prediction in transfer learning

Hi, I am using the transfer learning tutorial for my two-label classification task, following this link: https://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html.

Apart from the data loader part, the only change to the model code is the following, which sets the number of output classes:

model_ft = models.resnet18(pretrained=True)
num_ftrs = model_ft.fc.in_features
model_ft.fc = nn.Linear(num_ftrs, 2)

The predictions come from the following code in the training loop:

with torch.set_grad_enabled(phase == 'train'):
    outputs = model(inputs)
    print(outputs.shape, outputs)
    _, preds = torch.max(outputs, 1)

The output looks like this:

torch.Size([32, 2]) tensor([[ 3.1194, -1.4058],
        [ 1.4559, -1.3360],
        [ 1.9011, -1.0287],
        [ 0.8083, -1.0037],
        [ 0.7831, -1.1790],
        [ 0.7990, -1.3758],

I found that these numbers are not prediction probabilities like the softmax output I get in Keras.

Now I am using 5-fold cross-validation to predict on the test data, and I want to get the probabilities from each fold and then average them over all folds.

I don’t know how to change the code to get the prediction probabilities.

This tutorial uses torch.nn.CrossEntropyLoss() as the loss function, which applies log-softmax internally and therefore expects raw logits from the model. That is why the outputs you printed are unnormalized scores; to turn them into probabilities, just run torch.nn.functional.softmax(outputs, dim=1) on them.
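Here is a minimal sketch of how that could look for your 5-fold setup. It is not part of the tutorial: fold_models (a list of your five trained models) and inputs (a batch from your test loader) are placeholder names for things you already have.

import torch
import torch.nn.functional as F

# Hypothetical names: `fold_models` is a list of the five trained models,
# `inputs` is a batch of test images.
with torch.no_grad():
    fold_probs = []
    for fold_model in fold_models:
        fold_model.eval()
        logits = fold_model(inputs)           # raw scores, shape [batch_size, 2]
        probs = F.softmax(logits, dim=1)      # each row now sums to 1
        fold_probs.append(probs)

    avg_probs = torch.stack(fold_probs).mean(dim=0)  # average over the 5 folds
    preds = avg_probs.argmax(dim=1)                  # final predicted class per sample

Note that softmax does not change which class has the largest score, so torch.max(outputs, 1) from the tutorial still gives the same predicted classes; softmax is only needed when you want actual probability values to average or report.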


Thanks for your answer.