What's the difference between nn.Softmax(), nn.softmax(), and nn.functional.softmax()?

nn.Softmax is an nn.Module, which can be initialized e.g. in the __init__ method of your model and then used in the forward pass.

torch.softmax() (I assume nn.softmax is a typo, as that function is undefined) and nn.functional.softmax are equal, and I would recommend sticking to nn.functional.softmax, since it's documented. @tom gives a better answer here. :wink:
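A quick sketch showing that all three variants produce the same values (the input shape and dim here are arbitrary):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(2, 5)

# Module version: instantiate once (e.g. in your model's __init__),
# then call it like a layer in forward
softmax_module = nn.Softmax(dim=1)
out_module = softmax_module(x)

# Functional versions: call directly, passing dim explicitly
out_functional = F.softmax(x, dim=1)
out_torch = torch.softmax(x, dim=1)

print(torch.allclose(out_module, out_functional))  # True
print(torch.allclose(out_functional, out_torch))   # True
```

The module form is convenient when you want the softmax to show up as a named submodule of your model; the functional form avoids the extra object when you just need the operation inline.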
