Softmax and CrossEntropyLoss

Hi, when I use nn.CrossEntropyLoss(), is it necessary to apply nn.Softmax as the last layer in forward, or is it already included in CrossEntropyLoss?

For example, which of these is the correct version:

def forward(self, x):
    x = F.relu(self.conv1(x))
    x = F.relu(self.conv2(x))
    x = F.relu(self.conv3(x))
    x = self.avgpool(x)
    x = self.fc1(x)
    x = nn.Softmax(dim=1)(x)
    return x

or:

def forward(self, x):
    x = F.relu(self.conv1(x))
    x = F.relu(self.conv2(x))
    x = F.relu(self.conv3(x))
    x = self.avgpool(x)
    x = self.fc1(x)
    return x

No, nn.Softmax is not needed here. nn.CrossEntropyLoss applies nn.LogSoftmax and nn.NLLLoss internally, so it expects the raw logits from your last linear layer. Your second example is the correct one.
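For reference, here is a minimal sketch (with an arbitrary batch size and class count, not from your model) showing that nn.CrossEntropyLoss on raw logits gives the same result as nn.NLLLoss on log-softmax outputs, which is why no Softmax layer belongs in forward:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical raw model outputs: batch of 4 samples, 10 classes
logits = torch.randn(4, 10)
targets = torch.randint(0, 10, (4,))  # class indices

# CrossEntropyLoss on raw logits
ce = nn.CrossEntropyLoss()(logits, targets)

# Equivalent: LogSoftmax followed by NLLLoss
nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), targets)

print(torch.allclose(ce, nll))  # True

If you need probabilities at inference time, you can still call F.softmax(output, dim=1) on the logits outside of the loss computation.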
