How to choose between torch.nn.functional and torch.nn module?

I’m not sure what you mean by “giving parameters to their operations” exactly.
If you are not sure whether to use an nn.Module or the functional API, have a look at this longer post, where I describe my personal point of view.
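
For context, here is a minimal sketch of the two equivalent APIs (the tensor shape and ReLU are just illustrative choices, not from the linked post):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 3, 8, 8)

# Module form: instantiate once (a stateful object), then call it.
relu_module = nn.ReLU()
out_module = relu_module(x)

# Functional form: a plain stateless function call.
out_functional = F.relu(x)

# For a parameter-free op like ReLU, both produce identical results.
assert torch.equal(out_module, out_functional)
```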

Thanks for the reply,

I was assuming that the nn wrapper on any operation makes it trainable, like nn.Conv2d. But from the post you shared I now understand that operations like max pooling or average pooling cannot be trained, since they have no parameters even when wrapped in an nn class. Sorry, I mixed up my basics.
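
A quick way to verify this (my own illustrative check, not from the linked post): pooling modules expose no parameters, while nn.Conv2d does.

```python
import torch.nn as nn

pool = nn.MaxPool2d(kernel_size=2)
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)

# Pooling has nothing to learn, so its parameter list is empty.
print(len(list(pool.parameters())))  # 0

# Conv2d registers weight and bias tensors as learnable parameters.
print(len(list(conv.parameters())))  # 2 (weight and bias)
```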
Thanks for the reply.

Check this out; the solution to this issue is clearly explained there.

1 Like

Reading through this thread, I saw confusion across multiple posts about when to use nn vs. nn.functional. This is what people reading this thread should read twice:

torch.nn is stateful. torch.nn.functional is stateless. You have to instantiate the torch.nn modules so that their state (parameters and buffers) can be registered and tracked.

Dropout is generally considered stateful, since its behavior changes between training and evaluation mode, while functions like ReLU are stateless, as they have no weights or mode to keep track of. If I am wrong in any way, please chime in.
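
A small sketch of why this matters in practice (the toy module below is my own example): nn.Dropout reacts to model.eval() automatically, while F.dropout must be told explicitly via its training argument.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.drop = nn.Dropout(p=0.5)  # stateful: tracks train/eval mode

    def forward(self, x):
        a = self.drop(x)                                 # follows self.training automatically
        b = F.dropout(x, p=0.5, training=self.training)  # must be wired up by hand
        return a, b

net = Net()
net.eval()  # sets self.training = False on the module and all children
x = torch.ones(4)
a, b = net(x)
# In eval mode both are the identity, but the functional version only
# behaves correctly because training=self.training was passed explicitly;
# forgetting that flag is a common bug.
assert torch.equal(a, x) and torch.equal(b, x)
```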

2 Likes

The torch.nn module is typically used for operations that have learnable parameters, and torch.nn.functional for operations that do not.
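
The common pattern that follows from this (a minimal sketch with made-up layer sizes): keep parameterized layers as nn modules in __init__ so their weights are registered with the model, and call parameter-free ops through F in forward.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Learnable layers live here so their parameters are registered
        # with the module (visible to .parameters() and the optimizer).
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.fc = nn.Linear(16 * 8 * 8, 10)

    def forward(self, x):
        # Parameter-free ops can simply be functional calls.
        x = F.relu(self.conv(x))
        x = F.max_pool2d(x, kernel_size=2)
        return self.fc(torch.flatten(x, start_dim=1))

net = SmallNet()
out = net(torch.randn(2, 3, 16, 16))
print(out.shape)  # torch.Size([2, 10])
```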

1 Like

Also check here: python - Pytorch: nn.Dropout vs. F.dropout - Stack Overflow