Why F.nll_loss is not working

I have a bug here. If I replace the F.nll_loss function with the nn.MSELoss class it works, but why? F.nll_loss should have worked too. (For nll_loss I am modifying the target values, as you can see in this example, but for MSELoss it is not required.) Please help me understand why F.nll_loss is not working and how to use it in this code. I am training a cats vs. dogs model: I convert the 50x50x3 images to grayscale, flatten them, and feed them to a plain neural network (no convolutions) with ReLU activations.
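For context, the setup described above might look roughly like this. This is only a sketch reconstructed from the description (the layer sizes and names are assumptions, not the poster's actual code): a fully connected network on flattened 50x50 grayscale images with ReLU activations and two output classes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # Flattened 50x50x1 grayscale input; hidden size is an assumption
        self.fc1 = nn.Linear(50 * 50, 64)
        self.fc2 = nn.Linear(64, 2)  # two classes: cat, dog

    def forward(self, x):
        x = x.view(x.size(0), -1)    # flatten to (batch, 2500)
        x = F.relu(self.fc1(x))
        return self.fc2(x)           # raw logits (no softmax here)

net = Net()
out = net(torch.randn(4, 1, 50, 50))
print(out.shape)  # torch.Size([4, 2])
```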

What exactly is not working using F.nll_loss? Is your model not learning or do you get some errors?
Could you post the shapes of your model output and the target?

Whenever I use F.nll_loss, training seems fine and works very well, but on the test set it outputs the same class every time: it always predicts 1 (dog = 1, cat = 0). At first all images are 50x50x3, then I convert them to 50x50x1. The target values start as vectors ([1, 0] = dog, [0, 1] = cat), then I convert them to a single number. Also, in the network I am using a softmax function.
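For reference, converting one-hot target vectors like these into single class indices is usually done with `argmax`. A minimal sketch (not the poster's actual code; make sure the resulting index matches your label convention, e.g. dog = 1, cat = 0):

```python
import torch

# One-hot targets, one row per sample
targets_onehot = torch.tensor([[1., 0.],
                               [0., 1.],
                               [1., 0.]])

# Index of the "hot" entry becomes the class index
targets = targets_onehot.argmax(dim=1)
print(targets)  # tensor([0, 1, 0])
```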

F.nll_loss expects the model output to be log probabilities, so you should use F.log_softmax as your last activation.
Also, the target tensor should contain the class index only (have a look at the docs for an example), while it seems you are passing one-hot encoded targets.
Usually this should raise an Exception, so I'm not sure why the code runs.
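Putting both points together, a minimal sketch of correct F.nll_loss usage looks like this (dummy tensors stand in for real model output and labels):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 2)                # model output: (batch, num_classes)
log_probs = F.log_softmax(logits, dim=1)  # F.nll_loss expects log-probabilities
target = torch.tensor([0, 1, 1, 0])       # class indices, not one-hot vectors

loss = F.nll_loss(log_probs, target)

# F.cross_entropy combines log_softmax and nll_loss in one call,
# so it can be applied directly to the raw logits instead
loss2 = F.cross_entropy(logits, target)
print(loss.item(), loss2.item())
```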

Could you post a minimal code snippet to see how you are using this criterion and what model you are using?

Thanks, it works with log_softmax, that was the problem. It also runs once I encode the targets as class indices. Thanks a lot.