Model outputs the same value for every input

I am new to PyTorch and I'm trying to build a cat/dog classifier by training a model with the VGG16 architecture. However, when I run my model it doesn't really learn anything and always predicts either all ones or all zeros for any given batch of the test set, even though each batch contains a mix of cats and dogs.

I have been stuck on this for a while, so if anyone could give me any help, it would be much appreciated.

Code:

https://colab.research.google.com/drive/1WbixZzZYiRPASMVlqbV3br8_axVipHpG

Thanks a lot in advance.

nn.CrossEntropyLoss expects raw logits as the model output, since F.log_softmax and nn.NLLLoss are applied internally.
Remove the F.softmax(x, dim=1) in the forward method of your model and rerun the code.
Let me know if this helps.
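
For illustration, here's a minimal sketch of what that looks like (the architecture below is a placeholder, not the code from your notebook): the forward method returns the raw logits, and nn.CrossEntropyLoss is applied to them directly.

```python
import torch
import torch.nn as nn

# Sketch only: a tiny stand-in model to show the logits/loss pattern.
class CatDogClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(16, 2)  # two classes: cat and dog

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.fc(x)  # no F.softmax here -- return raw logits

criterion = nn.CrossEntropyLoss()  # applies log_softmax + NLLLoss internally
logits = CatDogClassifier()(torch.randn(4, 3, 224, 224))
loss = criterion(logits, torch.tensor([0, 1, 1, 0]))
```

If you need probabilities for inspection, apply F.softmax to the logits outside the model; for predictions you can call torch.argmax on the logits directly, since softmax doesn't change the argmax.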

Hi,

Thanks for getting back to me. I'm still having issues: the loss on the training set is decreasing, so it seems as if the network is learning, but on the test set it still predicts the same class for every member of the batch. Something weird is still going on.

In that case your model might just be overfitting.
Does the validation loss go down at the beginning of training and then stay constant after a certain number of epochs, or does it go up again?
You might need to play around with some hyperparameters or add regularization (e.g. via dropout), as in the sketch below.
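
As a rough example, dropout can be added to the classifier head like this (the layer sizes are assumptions based on the standard VGG16 head, not taken from your notebook):

```python
import torch.nn as nn

# Hedged sketch: a VGG16-style classifier head with dropout for regularization.
# 25088 = 512 * 7 * 7, the flattened VGG16 feature-map size for 224x224 inputs.
classifier = nn.Sequential(
    nn.Linear(25088, 4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),   # randomly zeroes 50% of activations during training
    nn.Linear(4096, 4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),
    nn.Linear(4096, 2),  # two classes; still returns raw logits
)
```

Also make sure you call model.train() before training and model.eval() before evaluating, so dropout is disabled at test time; running inference with dropout still active can itself produce odd predictions.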