How the dim parameter of Softmax() is reflected in CrossEntropyLoss()

Hello,

I am new to PyTorch, and I encountered a question about Softmax() and CrossEntropyLoss().

In a multi-class classification task, I set dim=1 in Softmax(). I want to know whether I need to set a similar parameter in CrossEntropyLoss(). However, I did not find a parameter like dim in CrossEntropyLoss(). How is the dim parameter of Softmax() reflected in CrossEntropyLoss()?

Thanks!

One of my guesses is that dim is set to a default value internally, for example 1.

nn.CrossEntropyLoss always expects the class dimension in dim1, as explained here. There is no dim argument to set: the loss applies log_softmax over dim1 internally, so you should pass raw logits directly, without an explicit Softmax() layer.
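A small sketch (with made-up shapes: a batch of 4 samples and 3 classes) that illustrates this equivalence: nn.CrossEntropyLoss on raw logits gives the same result as applying log_softmax over dim=1 followed by nn.NLLLoss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy data: batch of 4 samples, 3 classes (shape (N, C)).
logits = torch.randn(4, 3)
target = torch.tensor([0, 2, 1, 0])

# CrossEntropyLoss takes raw logits; class dimension is fixed at dim1.
loss_ce = nn.CrossEntropyLoss()(logits, target)

# Equivalent: log_softmax over dim=1, then NLLLoss.
loss_manual = nn.NLLLoss()(F.log_softmax(logits, dim=1), target)

print(torch.allclose(loss_ce, loss_manual))  # True
```

For 4D inputs of shape (N, C, H, W), e.g. in segmentation, the class dimension is likewise expected at dim1.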