Creating fully connected layers inside a class, depending on various parameters

I am trying to create a general feedforward net with CNN and FNN layers. Depending on the parameters passed to the class, I want to create the fully connected layers according to how the config is specified.

Would this be a good way of doing it inside a class EncodingNetwork(nn.Module)?

      for num_units, dropout_params, weight_decay in zip(
          fc_layer_params, dropout_layer_params, weight_decay_params):
        kernal_regularizer = None
        if weight_decay is not None:
          kernal_regularizer = ??
        layers.append(
            nn.Linear(
                num_units,
                activation=activation_fn,
                kernel_initializer=kernel_initializer,
                kernel_regularizer=kernal_regularizer,
                dtype=dtype))

Also, I am confused as to how I should specify L2 regularization in kernal_regularizer. I couldn’t find a way to do this in PyTorch, but surely there must be one.

Yes, creating the modules in the __init__ method sounds like the right approach.
Make sure to use nn.ModuleList for layers to properly register these modules, or wrap the list in e.g. nn.Sequential if that fits your use case.
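A minimal sketch of what this could look like (the parameter names fc_layer_params and dropout_layer_params are taken from your snippet; the input_size argument and the activation default are assumptions for illustration):

```python
import torch
import torch.nn as nn

class EncodingNetwork(nn.Module):
    def __init__(self, input_size, fc_layer_params, dropout_layer_params=None,
                 activation_fn=nn.ReLU):
        super().__init__()
        layers = []
        in_features = input_size
        # Assume dropout_layer_params is a sequence of dropout probabilities
        # (or None entries) aligned with fc_layer_params.
        dropout_layer_params = dropout_layer_params or [None] * len(fc_layer_params)
        for num_units, dropout_p in zip(fc_layer_params, dropout_layer_params):
            # nn.Linear needs both in_features and out_features, unlike Keras Dense
            layers.append(nn.Linear(in_features, num_units))
            layers.append(activation_fn())
            if dropout_p is not None:
                layers.append(nn.Dropout(p=dropout_p))
            in_features = num_units
        # nn.Sequential registers all submodules so their parameters are tracked
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)
```

Usage would then be e.g. `net = EncodingNetwork(10, (64, 32), (0.5, None))`, which gives a 10 → 64 → 32 stack with dropout after the first layer.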

nn.Linear doesn’t have a kernel_regularizer argument (it looks like a Keras/TF argument), so I’m unsure where this is coming from.
You could check the Parametrizations tutorial to see if this could be useful.
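For plain L2 regularization specifically, the common PyTorch idiom is the optimizer's weight_decay argument rather than a per-layer regularizer; per-layer coefficients (closer in spirit to your weight_decay_params) can be set via per-parameter groups. A sketch, assuming SGD and made-up coefficient values:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))

# Global L2 penalty: weight_decay applies the decay to every parameter.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, weight_decay=1e-4)

# Per-layer penalties: pass parameter groups, each with its own weight_decay.
optimizer = torch.optim.SGD(
    [
        {"params": model[0].parameters(), "weight_decay": 1e-4},
        {"params": model[2].parameters(), "weight_decay": 0.0},
    ],
    lr=1e-2,
)
```

Note that weight_decay is applied inside the optimizer update and doesn't show up in the loss value, unlike adding an explicit `lambda * w.pow(2).sum()` term.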