Do BCELoss and CrossEntropyLoss in PyTorch already include activation functions like Sigmoid and Softmax?

Hi, I am curious about the classification losses in PyTorch. In some tutorials online, people apply an activation function (softmax for CrossEntropyLoss, sigmoid for BCELoss) to convert the output to probabilities before computing the loss, but others do not. Do both loss functions in PyTorch already include those activation functions internally? When I checked the PyTorch GitHub repository, the docstring example for CrossEntropyLoss looks like this:
>>> import torch
>>> from torch import nn
>>> # Example of target with class probabilities
>>> loss = nn.CrossEntropyLoss()
>>> input = torch.randn(3, 5, requires_grad=True)
>>> target = torch.randn(3, 5).softmax(dim=1)  # soft targets: each row sums to 1
>>> output = loss(input, target)
>>> output.backward()

Thank you

Hi,

For torch.nn.CrossEntropyLoss, from the docs:

Note that this case is equivalent to the combination of LogSoftmax and NLLLoss.

which means softmax isn't needed explicitly: CrossEntropyLoss expects raw logits and applies log-softmax internally.
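A minimal sketch of that equivalence (the tensor shapes and values here are illustrative assumptions, not from the docs):

import torch
import torch.nn as nn

logits = torch.randn(3, 5)        # raw, unnormalized scores from a model
target = torch.tensor([1, 0, 4])  # class indices

ce = nn.CrossEntropyLoss()(logits, target)
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), target)

print(torch.allclose(ce, nll))  # True: CrossEntropyLoss applies log-softmax internally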

For torch.nn.BCELoss, a sigmoid layer is explicitly required, since BCELoss expects probabilities in [0, 1] rather than raw logits.
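For example (a rough sketch; the tensors here are made up for illustration):

import torch
import torch.nn as nn

logits = torch.randn(4)             # raw model outputs
target = torch.empty(4).random_(2)  # binary targets, 0. or 1.

probs = torch.sigmoid(logits)       # convert logits to probabilities in [0, 1]
loss = nn.BCELoss()(probs, target)  # passing raw logits here instead can raise a runtime error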

Consider using torch.nn.BCEWithLogitsLoss, which combines a Sigmoid layer and BCELoss in a single class and is more numerically stable than using a plain Sigmoid followed by a BCELoss.
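A quick sketch of that equivalence (again with illustrative tensors):

import torch
import torch.nn as nn

logits = torch.randn(4)
target = torch.empty(4).random_(2)

with_logits = nn.BCEWithLogitsLoss()(logits, target)
plain = nn.BCELoss()(torch.sigmoid(logits), target)

# Same value, but BCEWithLogitsLoss fuses the two ops and uses the
# log-sum-exp trick, so it stays stable for large-magnitude logits.
print(torch.allclose(with_logits, plain))  # True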