Shared parameters with another nested class

Hi there!

I have a model consisting of two parts: an embedding model and a graph convolutional neural network.
I now want to pass the parameters of both model parts to the optimizer (torch.optim.Adam), but this is a bit tricky, since the embedding model differs per input type. I therefore constructed a dictionary in self.multi_modal_embed that maps each input type to its embedding model.

So my question is: how can I access the parameters of the embedding model MultiModalEmbeddingLayer?

The base class BaseModel contains both the embedding model and the convolution model:

class BaseModel(nn.Module):
    """The base class for the graph convolutional neural network."""
    def __init__(self):  # further constructor arguments omitted
        super(BaseModel, self).__init__()
        # create a dictionary mapping each input type to its embedding layer
        self.multi_modal_embed = {
            input_type: MultiModalEmbeddingLayer(...)
            for input_type in self.input_types
        }

        # graph convolution model layer list
        self.convolution_layers: nn.ModuleList = nn.ModuleList()
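To make the problem concrete, here is a minimal, self-contained sketch (the class names Embed and Model are placeholders, not the code above): submodules stored in a plain Python dict are not registered, so model.parameters() never sees them.

```python
import torch
import torch.nn as nn

class Embed(nn.Module):
    """Stand-in for an embedding model with trainable weights."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        # a plain Python dict is invisible to PyTorch's parameter tracking
        self.embed = {"text": Embed(), "image": Embed()}
        self.head = nn.Linear(4, 2)

model = Model()
# only the head's weight and bias are found, not the embedding layers'
print(len(list(model.parameters())))  # 2
```

Passing model.parameters() to torch.optim.Adam here would silently leave the embedding weights untrained.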

Below is the embedding model MultiModalEmbeddingLayer. It stores its layers in the attribute self.embedding_layers, which is built depending on the input type. The embedding model is defined as follows:

class MultiModalEmbeddingLayer(nn.Module):
    """Embedding layer for the multi-modal encoder."""
    def __init__(self):  # further constructor arguments omitted
        super(MultiModalEmbeddingLayer, self).__init__()
        # module list holding the different embedding layers
        self.embedding_layers: th.nn.ModuleList = th.nn.ModuleList()
        # Build unimodal layers
        # Build multimodal layer

    def build_unimodal_layer(self):
        """Build the first unimodal layer(s) for each modality and encode the features."""
        # in case of 1 modality
        if self.num_modalities == 1:
            # create layer
            layer_1 = self.build_layer(...)
            # append layer to the embedding layer ModuleList
            self.embedding_layers.append(layer_1)

        # in case of 2 modalities
        elif self.num_modalities == 2:
            # create layers
            layer_1 = self.build_layer(...)
            layer_2 = self.build_layer(...)
            # append layers to the embedding layer ModuleList
            self.embedding_layers.extend([layer_1, layer_2])

There is a container called nn.ModuleDict that is meant for exactly this purpose, so PyTorch can find parameters inside dicts. It behaves essentially like a regular dict.

@ChickenTarm Thanks for the reply. I tried using torch.nn.ModuleDict, but it did not work, since the class instances of MultiModalEmbeddingLayer are not layers themselves. This class has an attribute in which all the layers are stored.

In your self.build_layer(…), I don't know what is inside, but are the parameters that exist there registered as nn.Parameter?
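The distinction the question is getting at can be shown in a few lines (Layer here is a hypothetical example, not the thread's code): only attributes wrapped in nn.Parameter (or living inside registered submodules such as an nn.ModuleList) are reported by .parameters(); a plain tensor attribute is ignored.

```python
import torch
import torch.nn as nn

class Layer(nn.Module):
    def __init__(self):
        super().__init__()
        # registered: nn.Parameter attributes show up in .parameters()
        self.weight = nn.Parameter(torch.randn(4, 4))
        # NOT registered: a plain tensor attribute is invisible to PyTorch
        self.scale = torch.ones(4)

layer = Layer()
print(len(list(layer.parameters())))  # 1
```

If build_layer returns modules built this way and they are appended to self.embedding_layers (an nn.ModuleList), their parameters are discoverable; raw tensors would have to be wrapped in nn.Parameter first.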