However, my parameter is not just a single weight tensor but an `nn.Sequential` module.
When I implement the following:
```python
class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # you need to register the parameter names earlier
        self.register_parameter('W_di', None)

    def forward(self, input):
        if self.W_di is None:
            self.W_di = nn.Sequential(
                nn.Linear(mL_n * 2, 1024),
                nn.ReLU(),
                nn.Linear(1024, self.hS)).to(device)
```
I get the following error:

```
TypeError: cannot assign 'torch.nn.modules.container.Sequential' as parameter 'W_di' (torch.nn.Parameter or None expected)
```

Is there any way to register an `nn.Sequential` as a whole parameter? Thanks!
The error is raised because `self.register_parameter('W_di', None)` tells `nn.Module` that the attribute `W_di` must always hold an `nn.Parameter` (or `None`), so assigning an `nn.Sequential` to it later is rejected. Instead, you could register the `nn.Sequential` container directly via `self.W_di = nn.Sequential(...)`, which will register all of its internal parameters.
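As a minimal sketch of that suggestion (the sizes below are hypothetical stand-ins for `mL_n` and `self.hS` from the original post), assigning the container as a module attribute in `__init__` registers everything automatically:

```python
import torch
import torch.nn as nn

IN_FEATURES = 8   # hypothetical stand-in for mL_n
HIDDEN = 16       # hypothetical stand-in for self.hS

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # Assigning an nn.Module to an attribute registers it (and all of
        # its parameters) automatically -- no register_parameter needed.
        self.W_di = nn.Sequential(
            nn.Linear(IN_FEATURES * 2, 1024),
            nn.ReLU(),
            nn.Linear(1024, HIDDEN),
        )

    def forward(self, x):
        return self.W_di(x)

model = MyModule()
out = model(torch.randn(4, IN_FEATURES * 2))
print(out.shape)                       # torch.Size([4, 16])
print(len(list(model.parameters())))   # 4: two weights + two biases
```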
I’m not sure if I understand the use case correctly, but if you need only this subset of parameters, you could call:

```python
model.W_di.parameters()
```
Would that work for you or do you need to handle these parameters somehow differently?
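One common reason to grab such a subset is to give it its own optimizer settings. A small illustration (module names here are hypothetical, chosen just for the example):

```python
import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.W_di = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
        self.other = nn.Linear(4, 4)  # hypothetical second submodule

model = MyModule()

# Only the W_di submodule's parameters: 2 weights + 2 biases.
subset = list(model.W_di.parameters())
print(len(subset))  # 4

# e.g. give that subset its own learning rate via parameter groups:
opt = torch.optim.SGD([
    {"params": model.W_di.parameters(), "lr": 1e-3},
    {"params": model.other.parameters(), "lr": 1e-4},
])
```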
Hi ptrblck, thanks for the reply.
As you can see in the `forward` function, `self.W_di = nn.Sequential(...)` is how I assigned the `nn.Sequential` to `self.W_di`. However, when `forward` is called, I get an error saying

```
TypeError: cannot assign 'torch.nn.modules.container.Sequential' as parameter 'W_di' (torch.nn.Parameter or None expected)
```

I’m not trying to call `parameters()` or a subset of the params; I’m just trying to run the training, but I got this error.
But what if I need to declare it as a parameter? Maybe I wasn’t explaining it fully here, but I’m trying to assign a parameter in the `forward` function, following the approach in Dynamic parameter declaration in forward function.
In that post, the Chief Crazy Person suggested using `self.register_parameter`.
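Note that `register_parameter` is only needed for raw tensors. For a whole submodule, lazy creation in `forward` works without it, because `nn.Module.__setattr__` registers any `nn.Module` assigned to an attribute. A sketch under that assumption (sizes are hypothetical), with the caveat that an optimizer built before the first forward pass will not see these lazily created parameters:

```python
import torch
import torch.nn as nn

class LazyModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.W_di = None  # plain placeholder; do NOT call register_parameter

    def forward(self, x):
        if self.W_di is None:
            # Assigning an nn.Module here registers it as a submodule,
            # so its parameters become part of this module's parameters().
            self.W_di = nn.Sequential(
                nn.Linear(x.size(-1), 32),
                nn.ReLU(),
                nn.Linear(32, 8),
            ).to(x.device)
        return self.W_di(x)

model = LazyModule()
out = model(torch.randn(2, 5))  # first call creates W_di
print(out.shape)                        # torch.Size([2, 8])
print(len(list(model.parameters())))    # 4: the Sequential's weights/biases
```

Build (or update) the optimizer only after the first forward pass, so the new parameters are included.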