F.cross_entropy vs torch.nn.CrossEntropyLoss

What is the difference between PyTorch's F.cross_entropy and torch.nn.CrossEntropyLoss?
As far as I understand, torch.nn.CrossEntropyLoss calls F.cross_entropy under the hood.


Hi,

There isn’t much difference for losses.
The main difference between nn.functional.xxx and nn.Xxx is that the latter has state and the former does not.
This means that for a linear layer, for example, if you use the functional version you have to handle the weights yourself (including passing them to the optimizer and moving them to the GPU), while the nn.Xxx version takes care of all of that for you via .parameters() and .to(device).
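A minimal sketch of the contrast (the shapes and names here are just for illustration):

import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4, 10)

# Module version: the weight and bias are registered as parameters,
# so optimizers and .to(device) pick them up automatically.
linear = nn.Linear(10, 5)
out = linear(x)
params = list(linear.parameters())

# Functional version: you create and track the tensors yourself,
# and you pass them explicitly on every call.
weight = torch.randn(5, 10, requires_grad=True)
bias = torch.randn(5, requires_grad=True)
out = F.linear(x, weight, bias)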

For loss functions, which in general have no parameters, you won't find much difference. One exception: if you use cross entropy with per-class weighting, the nn.CrossEntropyLoss() module lets you pass the weights once, when you create the module, and then reuse it. With the functional version, you have to pass the weights on every single call.
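For instance (a minimal sketch; the class weights below are made up for illustration):

import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(8, 3)              # batch of 8, 3 classes
target = torch.randint(0, 3, (8,))
class_weights = torch.tensor([1.0, 2.0, 0.5])

# Module: the weights are stored once, at construction time.
criterion = nn.CrossEntropyLoss(weight=class_weights)
loss = criterion(logits, target)

# Functional: the same weights must be passed on every call.
loss = F.cross_entropy(logits, target, weight=class_weights)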


Super clear, thanks so much.

Can you tell me what you mean by passing the weights manually?
I mean, if I use lossFn = nn.CrossEntropyLoss(),
then the code I use is:

optimiser = optim.SGD(net.parameters(), lr=0.01, momentum=0.9)
loss = lossFn(output, actual)  # criterion created once, reused every step
optimiser.zero_grad()
loss.backward()
optimiser.step()

What do I have to do differently with the functional one?
Thanks in advance.

You are not using a weighted cross entropy, so there is no difference, as mentioned above.
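For completeness, a minimal sketch of the same loop with the functional version (assuming net, output and actual are defined as in your snippet above):

import torch.nn.functional as F
import torch.optim as optim

optimiser = optim.SGD(net.parameters(), lr=0.01, momentum=0.9)
loss = F.cross_entropy(output, actual)  # no criterion object; called directly
optimiser.zero_grad()
loss.backward()
optimiser.step()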
