What's the difference between `torch.nn.functional` and `torch.nn`?

Personally, I think creating the activation, dropout, pooling, etc. modules in `__init__` makes it easier to reuse the model. For example, when extracting features you may wish to wrap a pretrained model and override its `forward` function to return the feature variables you need. Having these layers as modules lets you do this conveniently instead of inserting many functional calls, as in the sketch below.
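A rough sketch of that pattern (the names `Net`/`FeatureNet` and the 32×32 RGB input size are just assumptions for illustration):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
        self.relu = nn.ReLU()      # as modules, these also show up in print(net)
        self.pool = nn.MaxPool2d(2)
        self.drop = nn.Dropout(0.5)
        self.fc = nn.Linear(16 * 16 * 16, 10)  # assumes 32x32 inputs -> 16x16 after pooling

    def forward(self, x):
        x = self.pool(self.relu(self.conv(x)))
        x = self.drop(torch.flatten(x, 1))
        return self.fc(x)

class FeatureNet(Net):
    # Reuses the exact same modules, but stops before the classifier head.
    def forward(self, x):
        return torch.flatten(self.pool(self.relu(self.conv(x))), 1)
```

Since `FeatureNet` only overrides `forward`, it shares `Net`'s parameters and can load the same `state_dict`.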

But the functional interface lets us do some fancy operations that have no module form, like explicitly convolving two feature maps with `F.conv2d`.
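For instance, a minimal sketch of that trick, where one feature map is reinterpreted as the convolution weight (the shapes here are made up for illustration):

```python
import torch
import torch.nn.functional as F

feat_a = torch.randn(1, 8, 32, 32)  # (batch, channels, H, W)
feat_b = torch.randn(1, 8, 5, 5)    # its batch dim acts as out_channels below

# F.conv2d expects weights of shape (out_channels, in_channels, kH, kW),
# so feat_b can be passed directly as the filter bank.
response = F.conv2d(feat_a, feat_b, padding=2)
print(response.shape)  # torch.Size([1, 1, 32, 32])
```

There is no `nn.Conv2d` equivalent for this, since the module owns a fixed weight tensor rather than taking one as input.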
