Problem in NN (I am a beginner)

import torch.nn as nn
import torch.nn.functional as F

class Classifier(nn.Module):

    def __init__(self):
        super().__init__()

        self.layer1 = nn.Linear(20, 17)
        self.layer2 = nn.Linear(17, 14)
        self.layer3 = nn.Linear(14, 10)
        self.layer4 = nn.Linear(10, 8)
        self.layer5 = nn.Linear(8, 4)

    def forward(self, x):

        x = F.relu(self.layer1(x))
        x = F.relu(self.layer2(x))
        x = F.relu(self.layer3(x))
        x = F.relu(self.layer4(x))
        x = F.softmax(self.layer5(x))

        return x
Here is the training code for the NN:

import torch.optim as optim

model = Classifier()

optimizer = optim.SGD(model.parameters(), lr=0.005)

criterion = nn.CrossEntropyLoss()

train_data = train_data.float()
train_lable = train_label.float()

epoch = 500
for i in range(epoch):

    optimizer.zero_grad()
    output = model(train_data)
    loss = criterion(output, train_label)
    loss.backward()
    optimizer.step()

    print('At ', i, '/', epoch, ' epoch loss is: ', loss.item())

I have this code for my NN and am getting very bad results after training. I think there is some problem with how the loss is calculated. I am using softmax regression for multiclass classification. The dataset is fine and well organized, as it is preprocessed and available on Kaggle.
It would be great if anyone could help me here.

My output is a class label: either 0, 1, 2, or 3.
What changes need to be made here?

Hello Ikram!

For starters, get rid of the final softmax() activation and pass the
result of your last linear layer directly to CrossEntropyLoss.

As mentioned (but not really emphasized) in the CrossEntropyLoss
documentation, CrossEntropyLoss expects raw-score logits
(rather than the probabilities that are produced by softmax()).
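
In case it helps, here is a minimal sketch of that change, assuming the same layer sizes as above; the batch of inputs and targets is made up purely to keep the example self-contained:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Linear(20, 17)
        self.layer2 = nn.Linear(17, 14)
        self.layer3 = nn.Linear(14, 10)
        self.layer4 = nn.Linear(10, 8)
        self.layer5 = nn.Linear(8, 4)

    def forward(self, x):
        x = F.relu(self.layer1(x))
        x = F.relu(self.layer2(x))
        x = F.relu(self.layer3(x))
        x = F.relu(self.layer4(x))
        return self.layer5(x)  # raw logits, no softmax()

model = Classifier()
criterion = nn.CrossEntropyLoss()

# dummy batch, just for illustration
inputs = torch.randn(16, 20)          # float features
targets = torch.randint(0, 4, (16,))  # class indices 0-3, long tensor

logits = model(inputs)                # shape (16, 4)
loss = criterion(logits, targets)     # log-softmax is applied internally

# if you need predicted classes or probabilities at evaluation time:
preds = logits.argmax(dim=1)
probs = F.softmax(logits, dim=1)

Note that the argmax of the logits is the same as the argmax of the
softmax output, so for computing accuracy you never need the softmax
at all.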

Best.

K. Frank


Thank you so much, KFrank!
It worked.