F.relu and nn.ReLU

Hi,
in PyTorch Lightning, I recently found that if I use F.dropout in the forward step, the dropout is still active even when I put the model in eval mode with model.eval(). I then realized I should replace it with nn.Dropout as a module attribute; after that, everything behaves normally.
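
For reference, here is a minimal sketch (not my actual model) of the difference I ran into:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.ones(8)

# F.dropout defaults to training=True, so it keeps zeroing out
# elements even in eval mode, unless you pass
# training=self.training explicitly.
print(F.dropout(x, p=0.5))  # elements still randomly dropped

# nn.Dropout is a registered submodule, so model.eval() flips its
# internal training flag and it becomes the identity.
drop = nn.Dropout(p=0.5)
drop.eval()
print(drop(x))  # returns x unchanged
```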

  1. My concern now is: if I use F.relu in forward instead of an nn.ReLU attribute, does the model still behave normally in backward? That is, does autograd know there is a ReLU there, or does F.relu only take effect in forward?
  2. And if I want to apply nn.ReLU twice, should I define one self.relu = nn.ReLU() and use self.relu in two different positions, or should I define two, i.e. self.relu1 = nn.ReLU() and self.relu2 = nn.ReLU()?

So basically, I am not very clear on how backpropagation finds which module (or function) it should consider.

Hi Shawn!

F.relu() (which is to say torch.nn.functional.relu()) is a
function. nn.ReLU (torch.nn.ReLU) is a class whose forward() simply
calls F.relu(). These two ways of packaging the function do the same
thing, including when calling .backward(): autograd records the
operations applied to tensors during the forward pass, so it doesn't
matter whether the relu came from a module attribute or from a plain
function call.
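
As a quick sanity check (a minimal sketch, nothing more), you can verify that the two forms produce identical gradients:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(5, requires_grad=True)

# functional form
F.relu(x).sum().backward()
grad_functional = x.grad.clone()
x.grad = None

# module form -- its forward() just calls F.relu()
nn.ReLU()(x).sum().backward()
grad_module = x.grad.clone()

print(torch.equal(grad_functional, grad_module))  # True
```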

There is no need to instantiate two instances of the nn.ReLU function
object (but you can if you want). An instance of nn.ReLU doesn't
contain any state, so whether you have two instances or only one,
they all do the same thing, simply calling F.relu(). (My preference
would be to instantiate only one nn.ReLU function object, or, if I
didn't need an object instance, to simply call F.relu(). But it
really doesn't matter.)
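
For example (a minimal sketch with made-up layer names and sizes), reusing a single instance in two places works fine:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 8)
        self.fc2 = nn.Linear(8, 8)
        self.relu = nn.ReLU()  # one stateless instance is enough

    def forward(self, x):
        x = self.relu(self.fc1(x))  # first use
        x = self.relu(self.fc2(x))  # second use of the same instance
        return x

net = Net()
out = net(torch.randn(2, 8))
```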

Best.

K. Frank

Thank you! That makes it very clear.