Constant loss after each epoch

Hello,

I have the following question. I am training a convolutional neural network on the CIFAR-10 dataset. Here is the structure of my fully connected classifier head (a rough code sketch follows the list):

  1. Dropout(0.…)
  2. Linear layer
  3. ReLU
  4. Dropout(0.5)
  5. Softmax
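Roughly, the head looks something like this (just a sketch: the layer sizes are placeholders since I only listed the layer types, and the first dropout value got cut off above):

```python
import torch.nn as nn

# Sketch of the classifier head as described in the list above.
# Feature size 512 is a placeholder; 10 classes for CIFAR-10.
head = nn.Sequential(
    nn.Dropout(p=0.5),    # 1. dropout (exact value truncated in my post)
    nn.Linear(512, 10),   # 2. linear layer
    nn.ReLU(),            # 3. ReLU
    nn.Dropout(p=0.5),    # 4. dropout(0.5)
    nn.Softmax(dim=1),    # 5. softmax
)
```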

I have observed that when I print the loss (cross entropy with L2 regularization) at each epoch, the loss stays constant at 2.3036. However, when I remove the ReLU and softmax activations and replace (5) with a linear layer, this is not the case.

In either case my network does not learn, and I am getting very low accuracy on the training set.

Which loss function are you using?
If you are using e.g. nn.CrossEntropyLoss, you should pass the logits directly without applying softmax to them.
Make sure to check the docs and pass the output in the expected value range to your criterion.
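Something along these lines (a minimal sketch; the shapes and the random tensors are just placeholders for your model output and labels):

```python
import torch
import torch.nn as nn

# nn.CrossEntropyLoss expects raw logits and applies log_softmax internally,
# so the model should end with a plain linear layer (no softmax, no ReLU after it).
criterion = nn.CrossEntropyLoss()

logits = torch.randn(8, 10)            # e.g. model(images), shape [batch, num_classes]
targets = torch.randint(0, 10, (8,))   # class indices in [0, num_classes)

loss = criterion(logits, targets)      # pass the logits directly to the criterion
```

As a side note, a constant loss of roughly 2.30 with 10 classes corresponds to -log(1/10), i.e. the model is effectively outputting a uniform distribution over the classes and not learning.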

Great. Thank you so much