nn.ReLU() creates an nn.Module, which you can add e.g. to an nn.Sequential model.
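A minimal sketch of the module approach (layer sizes here are just for illustration):

```python
import torch
import torch.nn as nn

# nn.ReLU() is a module, so it can sit between layers in an nn.Sequential
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 2),
)

x = torch.randn(4, 10)
out = model(x)  # shape: [4, 2]
```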
nn.functional.relu, on the other hand, is just the functional API call to the relu function, so you can call it yourself e.g. in your forward method.
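The equivalent functional-style sketch (the class name and layer sizes are made up for the example):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 2)

    def forward(self, x):
        # relu applied as a plain function call; no module needed,
        # since relu has no parameters or state
        x = F.relu(self.fc1(x))
        return self.fc2(x)

out = MyModel()(torch.randn(4, 10))
```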
Generally speaking, whether you prefer modules or functional calls for the activations might come down to your coding style. Personally, I prefer the module approach if the activation has an internal state, e.g. PReLU.
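To illustrate the PReLU point: the module version owns its learnable slope parameter, so it is registered and optimized automatically, whereas F.prelu(input, weight) expects you to create and manage that weight tensor yourself. A small sketch (class name invented for the example):

```python
import torch
import torch.nn as nn

class WithPReLU(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 10)
        # nn.PReLU holds a learnable slope, registered as a parameter
        # of this module, so the optimizer will update it for free
        self.act = nn.PReLU()

    def forward(self, x):
        return self.act(self.fc(x))

model = WithPReLU()
print([name for name, _ in model.named_parameters()])
# ['fc.weight', 'fc.bias', 'act.weight']  <- 'act.weight' is the PReLU slope
```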