Your labels have one dimension too many. It appears that
you are using a cross-entropy-type loss criterion. PyTorch's CrossEntropyLoss (and related loss criteria) expect your outputs to have shape [nBatch, nClass] and your labels
to have shape [nBatch] (with no nClass dimension). If, for
some reason, your labels naturally have a trailing singleton
dimension, you can squeeze() it away.
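A minimal sketch of the fix (the shapes nBatch = 4, nClass = 3 are just illustrative):

```python
import torch

criterion = torch.nn.CrossEntropyLoss()

nBatch, nClass = 4, 3
outputs = torch.randn(nBatch, nClass)            # shape [4, 3] = [nBatch, nClass]
labels = torch.randint(nClass, (nBatch, 1))      # shape [4, 1] -- unwanted trailing dim

# Passing labels of shape [4, 1] raises a shape error, so squeeze the
# trailing singleton dimension to get the expected shape [nBatch]:
loss = criterion(outputs, labels.squeeze(1))     # labels.squeeze(1) has shape [4]
```

Note that squeeze(1) only removes the dimension if it has size one, so it won't silently mangle labels of some other shape.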