Is log_softmax + NLLLoss == CrossEntropyLoss?
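Yes, as far as I know `CrossEntropyLoss` is exactly `log_softmax` followed by `NLLLoss`. A minimal numerical check (using the functional API; the module forms `nn.CrossEntropyLoss()` / `nn.NLLLoss()` behave the same way):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 3)            # batch of 4 samples, 3 classes
targets = torch.tensor([0, 2, 1, 2])  # class index per sample

# cross_entropy fuses log_softmax + nll_loss into one call
ce = F.cross_entropy(logits, targets)
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)

assert torch.allclose(ce, nll)
```

Note that `cross_entropy` expects raw logits, so applying `log_softmax` (or `softmax`) yourself before `cross_entropy` would double-normalize and give wrong losses.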

Could you elaborate on “log_softmax gives different results depending on shape”? I’ve printed the shapes and they look the same.