Cannot assign 'torch.nn.modules.container.Sequential' as parameter

I was following this approach
(Dynamic parameter declaration in forward function) to dynamically assign parameters in the forward function.

However, my parameter is not just a single weight tensor; it is an nn.Sequential.

When I implement the following:

class MyModule(nn.Module):
    def __init__(self):
        super(MyModule, self).__init__()
        # you need to register the parameter names earlier
        self.register_parameter('W_di', None)

    def forward(self, input):
        if self.W_di is None:
            # mL_n, self.hS, and device are defined elsewhere in my code
            self.W_di = nn.Sequential(
                nn.Linear(mL_n * 2, 1024),
                nn.ReLU(),
                nn.Linear(1024, self.hS)).to(device)

I get the following error.

TypeError: cannot assign 'torch.nn.modules.container.Sequential' as parameter 'W_di' (torch.nn.Parameter or None expected)

Is there any way I can register the whole nn.Sequential as a parameter? Thanks!

You could register the nn.Sequential container directly via self.W_di = nn.Sequential(...), which will register all of its internal parameters.
I’m not sure if I understand the use case correctly, but if you only need this subset of parameters, you could call:

model.W_di.parameters()
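
For example, a minimal sketch of passing just this subset to the optimizer as its own parameter group (the learning rates here are placeholders, not values from this thread):

import torch.optim as optim

# give W_di's parameters their own learning rate
optimizer = optim.Adam([
    {'params': model.W_di.parameters(), 'lr': 1e-3},
    {'params': [p for n, p in model.named_parameters()
                if not n.startswith('W_di.')], 'lr': 1e-4},
])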

Would that work for you or do you need to handle these parameters somehow differently?

Hi ptrblck, thanks for the reply.
As you can see in the forward function, self.W_di = nn.Sequential(...) is exactly how I assigned the nn.Sequential to self.W_di. However, when forward is called, I’m getting an error saying

TypeError: cannot assign 'torch.nn.modules.container.Sequential' as parameter 'W_di' (torch.nn.Parameter or None expected)

I’m not trying to call .parameters() or handle a subset of the parameters differently. I’m just trying to run training, but I get this error.

PyTorch 1.1.0

In your __init__ you are using:

self.register_parameter('W_di', None)

which will create this error later: once 'W_di' is registered as a parameter, nn.Module only accepts an nn.Parameter (or None) when assigning to that attribute, so the later assignment of an nn.Sequential raises the TypeError.
If you don’t need W_di as an nn.Parameter, you could just remove this line.
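
For completeness, the registered slot does work if you later assign an actual nn.Parameter. A minimal sketch of that pattern (the module name and shapes are made up for illustration):

import torch
import torch.nn as nn

class LazyParamModule(nn.Module):
    def __init__(self):
        super(LazyParamModule, self).__init__()
        self.register_parameter('W_di', None)  # placeholder slot

    def forward(self, input):
        if self.W_di is None:
            # an nn.Parameter satisfies the registered slot; an nn.Sequential does not
            self.W_di = nn.Parameter(torch.randn(input.size(1), 4))
        return input @ self.W_di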

But what if I need to declare it as a parameter? Maybe I wasn’t explaining it fully here, but I’m trying to assign a parameter in the forward function following this approach: Dynamic parameter declaration in forward function.

In that post, the Chief Crazy Person suggested using self.register_parameter.

Ah OK, I clearly misunderstood the use case.
This code should work like the one Adam suggested :wink:

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.submodule = None

    def forward(self, x):
        if self.submodule is None:
            # lazily create the submodule on the first forward pass;
            # assigning an nn.Module to an attribute registers it
            # (and its parameters) automatically
            self.submodule = nn.Sequential(
                nn.Linear(1, 1),
                nn.ReLU()
            )
        x = self.submodule(x)
        return x

model = MyModel()
model(torch.randn(1, 1))
print(dict(model.named_parameters()))
> {'submodule.0.weight': Parameter containing:
tensor([[-0.1282]], requires_grad=True), 'submodule.0.bias': Parameter containing:
tensor([-0.7143], requires_grad=True)}
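
One caveat with this lazy pattern (my addition, not from the original discussion): parameters created inside forward don’t exist yet right after the model is constructed, so build the optimizer only after a first (dummy) forward pass, e.g.:

import torch.optim as optim

model = MyModel()
model(torch.randn(1, 1))  # dummy pass creates self.submodule and its parameters
optimizer = optim.Adam(model.parameters(), lr=1e-3)  # now sees all parameters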