I guess it is because there are cases where you need to add parameters to a given module. Here is example code. (The code below is just an example, but I did this kind of thing when I implemented a spectral normalization layer.)
import torch
import torch.nn as nn

class Test(nn.Module):
    def __init__(self, module):
        super(Test, self).__init__()
        self.module = module
        self.register_param()

    def register_param(self):
        exist_w = hasattr(self.module, 'w')
        if not exist_w:
            w = nn.Parameter(torch.ones(1))
            # register_parameter needs a name as its first argument
            self.module.register_parameter('w', w)  # register 'w' to module

    def forward(self, x):
        return x

conv = nn.Conv2d(3, 3, kernel_size=1)
conv_w = Test(conv)
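As a quick sanity check of the wrapper above (assuming the fixed code with register_parameter('w', w)), the new parameter shows up both on the wrapped module and in the wrapper's parameter dict:

print(hasattr(conv, 'w'))                             # True
print('module.w' in dict(conv_w.named_parameters()))  # True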
nn.Module.register_parameter takes a name and an nn.Parameter (or None), and it first checks that the name is valid and not already an attribute of the module. nn.Parameter performs no such check; it is just a Tensor subclass and has no notion of a name.
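A minimal sketch of that difference (the printed error messages reflect a recent PyTorch version; exact wording may differ):

import torch
import torch.nn as nn

m = nn.Linear(2, 2)

# Creating an nn.Parameter involves no name check at all;
# it is just a Tensor marked as a parameter:
p = nn.Parameter(torch.ones(1))

# register_parameter validates the name before storing the parameter.
# A name that clashes with an existing attribute raises KeyError:
try:
    m.register_parameter('forward', p)  # 'forward' is already a method
except KeyError as err:
    print(err)  # attribute 'forward' already exists

# A dotted name is rejected as well:
try:
    m.register_parameter('a.b', p)
except KeyError as err:
    print(err)  # parameter name can't contain "."

# With a valid, unused name the parameter is registered normally:
m.register_parameter('w', p)
print('w' in dict(m.named_parameters()))  # True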