Hi there,
Are there any issues with adding layers (or objects in general) to self in a class based on nn.Module
and then not actually using them in the forward call? E.g. batch norm in this contrived example:
class MyModule(nn.Module):
    def __init__(self, use_bn=True):
        super(MyModule, self).__init__()
        self.use_bn = use_bn
        self.conv = nn.Conv1d(...)
        self.bn = nn.BatchNorm1d(...)

    def forward(self, x):
        x = self.conv(x)
        if self.use_bn:
            x = self.bn(x)
        return x
My understanding is that this is safe (if wasteful): if the call to self.bn()
is not included in forward, it's not in the graph, so it will have no impact on optimisation. Likewise with arbitrary objects attached to self.
Is that the case, or are there some modules/layers where this will have a negative impact?
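One thing worth noting is that assigning a module to self registers it and its parameters on the parent, regardless of whether forward ever calls it. So the unused layer still shows up in model.parameters(), gets saved in the state_dict, and would be handed to an optimizer built from model.parameters() (where e.g. weight decay could in principle touch it, although with no gradient flowing through it most optimizers leave it alone). A minimal sketch illustrating the registration, with placeholder Conv1d/BatchNorm1d sizes filled in for the ... in the snippet above:

```python
import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self, use_bn=True):
        super(MyModule, self).__init__()
        self.use_bn = use_bn
        # Arbitrary example sizes, standing in for the ... above
        self.conv = nn.Conv1d(4, 8, kernel_size=3)
        self.bn = nn.BatchNorm1d(8)

    def forward(self, x):
        x = self.conv(x)
        if self.use_bn:
            x = self.bn(x)
        return x

m = MyModule(use_bn=False)

# bn's parameters are registered even though forward skips the layer
names = [n for n, _ in m.named_parameters()]
print(names)  # includes 'bn.weight' and 'bn.bias'

# But since bn never appears in the graph, no gradient reaches it
out = m(torch.randn(2, 4, 10))
out.sum().backward()
print(m.conv.weight.grad is None)  # False: conv was used
print(m.bn.weight.grad is None)    # True: bn never entered the graph
```

If the extra parameters in the state_dict or optimizer are a concern, a common pattern is to conditionally register the layer in __init__ (e.g. self.bn = nn.BatchNorm1d(8) if use_bn else nn.Identity()) so forward can call self.bn unconditionally.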