Multiclass classification

Is my code for getting the predicted class correct? I don't understand how softmax is used inside the cross entropy loss here to get the class with the highest probability.

for i, (x, y) in enumerate(zip(feature_trainloader, label_trainloader)):
    optimizer.zero_grad()

    output = self.forward(x)                      # raw logits, no softmax applied
    target = y

    loss = self.CrossEntropyLoss(output, target)
    epoch_loss_train.append(loss.item())

    y_true_train = torch.cat((y_true_train, target))

    # index of the largest logit along the class dimension = predicted class
    _, pred_class = torch.max(output, dim=1)
    y_pred_train = torch.cat((y_pred_train, pred_class))

    loss.backward()
    optimizer.step()

nn.CrossEntropyLoss applies F.log_softmax and nn.NLLLoss internally to calculate the loss, which is why you shouldn't apply a softmax activation to the model outputs.
To get the predictions you can apply torch.argmax directly on the logits, since the order won’t change compared to the probabilities (the highest/lowest logit will also be the highest/lowest probability).
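For example, a minimal sketch with made-up logit values:

    import torch

    logits = torch.tensor([[2.0, -1.0, 0.5],
                           [0.1,  3.0, -2.0]])  # raw model outputs, shape (batch, classes)
    probs = torch.softmax(logits, dim=1)        # softmax preserves the per-row ordering

    # argmax over the logits picks the same class as argmax over the probabilities
    print(torch.argmax(logits, dim=1))          # tensor([0, 1])
    print(torch.argmax(probs, dim=1))           # tensor([0, 1])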


You can't use nn.CrossEntropyLoss with a softmax layer in PyTorch; pass it the raw logits, as @ptrblck explained above.

If you really need the softmax probabilities at the end, apply torch.softmax(output, dim=1) to the logits (torch.exp(output) only recovers them if output already holds log-probabilities from log_softmax).
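A minimal sketch with a hypothetical logit tensor, showing that both routes agree:

    import torch

    output = torch.tensor([[2.0, -1.0, 0.5]])           # raw logits
    probs = torch.softmax(output, dim=1)                # probabilities, row sums to 1
    log_probs = torch.log_softmax(output, dim=1)        # what CrossEntropyLoss uses internally
    print(torch.allclose(torch.exp(log_probs), probs))  # True: exp undoes the log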

And if you really want to train with a softmax output layer, then use a custom cross entropy loss that expects probabilities, something like:

def custom_categorical_cross_entropy(y_pred, y_true):
    # y_pred: softmax probabilities, y_true: one-hot encoded targets
    y_pred = torch.clamp(y_pred, 1e-9, 1 - 1e-9)    # avoid log(0)
    return -(y_true * torch.log(y_pred)).sum(dim=1).mean()
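A usage sketch calling the function above, assuming the model ends in a softmax layer and the targets are one-hot encoded (the batch values here are made up):

    import torch
    import torch.nn.functional as F

    logits = torch.randn(4, 3)                 # hypothetical batch of 4 samples, 3 classes
    y_pred = torch.softmax(logits, dim=1)      # model output after the softmax layer
    y_true = F.one_hot(torch.tensor([0, 2, 1, 0]), num_classes=3).float()
    loss = custom_categorical_cross_entropy(y_pred, y_true)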

Thanks. So that means my way of getting pred_class is correct, right?