F.relu is the functional interface of torch.nn.ReLU. Modules like torch.nn.ReLU are sometimes handy, for example when quickly creating a model using nn.Sequential.
You can’t add F.relu to an nn.Sequential, as it expects objects that inherit from nn.Module.
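To illustrate, here is a minimal sketch of a model built with nn.Sequential (the layer sizes are arbitrary): the activation has to be the nn.ReLU module, not the F.relu function.

```python
import torch
import torch.nn as nn

# nn.Sequential expects Modules, so the activation is nn.ReLU(), not F.relu
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

out = model(torch.randn(3, 4))
print(out.shape)  # torch.Size([3, 2])
```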
About clamp: it is a tensor method that is more generic than ReLU, and it works for both Tensors and Variables, while F.relu only works on Variables. Also, ReLU is a common enough operation to deserve its own function, instead of having to write x.clamp(min=0).
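A quick check of the equivalence (a toy tensor, purely for illustration): clamping at a minimum of 0 zeroes out the negatives and leaves the rest untouched, which is exactly what ReLU does.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

# clamp(min=0) maps negatives to 0 and passes non-negatives through,
# which is the same computation as ReLU
assert torch.equal(x.clamp(min=0), F.relu(x))
print(x.clamp(min=0))  # tensor([0.0000, 0.0000, 0.0000, 1.5000])
```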
Yes. The functional interface is very handy when you want to perform more complex operations.
For example, let’s say that you want the weights of your convolution to be the output of some other network (as in hypernetworks, for example).
In this case you can’t use the Module interface, since a Module creates its weights during initialization, but you can easily use the functional interface for that:
weights = net1(input)           # another network produces the conv weights
res = F.conv2d(input, weights)  # apply them via the functional interface
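Fleshing this out, here is a minimal runnable sketch. The hypernetwork net1, its input z, and all the sizes are made up for illustration: net1 is just a linear layer whose output is reshaped into the (out_channels, in_channels, kH, kW) weight layout that F.conv2d expects.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

out_ch, in_ch, k = 4, 3, 3

# Hypothetical hypernetwork: a linear layer that emits one flat vector
# containing every weight of the convolution
net1 = nn.Linear(10, out_ch * in_ch * k * k)

z = torch.randn(10)                          # conditioning input for net1
weights = net1(z).view(out_ch, in_ch, k, k)  # reshape to conv weight layout

input = torch.randn(1, in_ch, 8, 8)          # a batch of one 8x8 "image"
res = F.conv2d(input, weights)               # convolve with generated weights
print(res.shape)  # torch.Size([1, 4, 6, 6])
```

Note that nn.Conv2d could not express this: its weights are Parameters created at construction time, whereas here they are recomputed from z on every forward pass.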