PyTorch Geometric NNConv Layer

Hello! I’m using the PyTorch Geometric library and I have a small model:

import torch
import torch.nn.functional as F
from torch.nn import Linear, ReLU, Sequential
from torch_geometric.nn import NNConv


class DGN(torch.nn.Module):
    def __init__(self, MODEL_PARAMS):
        super(DGN, self).__init__()
        self.model_params = MODEL_PARAMS

        # Each NNConv gets its own edge network: an MLP that maps the edge
        # features to a weight matrix of shape [in_channels, out_channels].
        nn = Sequential(Linear(self.model_params["Linear1"]["in"], self.model_params["Linear1"]["out"]), ReLU())
        self.conv1 = NNConv(self.model_params["conv1"]["in"], self.model_params["conv1"]["out"], nn, aggr='mean')

        nn = Sequential(Linear(self.model_params["Linear2"]["in"], self.model_params["Linear2"]["out"]), ReLU())
        self.conv2 = NNConv(self.model_params["conv2"]["in"], self.model_params["conv2"]["out"], nn, aggr='mean')

        nn = Sequential(Linear(self.model_params["Linear3"]["in"], self.model_params["Linear3"]["out"]), ReLU())
        self.conv3 = NNConv(self.model_params["conv3"]["in"], self.model_params["conv3"]["out"], nn, aggr='mean')
        
        
    def forward(self, data):
        """
        Args:
            data (Object): data object consisting of three parts: x, edge_attr, and edge_index.
                           This object can be produced by the helper.cast_data function.
                x: Node features with shape [number_of_nodes, 1] (simply a vector of ones,
                   since we don't have any node features)
                edge_attr: Edge features with shape [number_of_edges, number_of_views]
                edge_index: Graph connectivity with shape [2, number_of_edges] (COO format)
        """
        x, edge_attr, edge_index = data.x, data.edge_attr, data.edge_index
        
        x = F.relu(self.conv1(x, edge_index, edge_attr))
        
        x = F.relu(self.conv2(x, edge_index, edge_attr))
        
        x = F.relu(self.conv3(x, edge_index, edge_attr))
        
        # Pairwise absolute differences between node embeddings, summed over
        # the feature dimension, give the [N_ROIs, N_ROIs] output matrix.
        repeated_out = x.repeat(self.model_params["N_ROIs"], 1, 1)
        repeated_t = torch.transpose(repeated_out, 0, 1)
        diff = torch.abs(repeated_out - repeated_t)
        cbt = torch.sum(diff, 2)

        return cbt
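For context, this is roughly how I instantiate the model. Every number below is a made-up placeholder (not my actual configuration); the only real constraint is that each edge network's Linear layer must output in_channels * out_channels values for its NNConv.

# Hypothetical MODEL_PARAMS, just to illustrate the expected structure.
number_of_views = 4  # placeholder edge-feature dimension

MODEL_PARAMS = {
    "N_ROIs": 35,                                       # placeholder number of nodes
    "conv1": {"in": 1, "out": 36},
    "Linear1": {"in": number_of_views, "out": 1 * 36},  # must equal conv1 in * out
    "conv2": {"in": 36, "out": 24},
    "Linear2": {"in": number_of_views, "out": 36 * 24},
    "conv3": {"in": 24, "out": 5},
    "Linear3": {"in": number_of_views, "out": 24 * 5},
}

model = DGN(MODEL_PARAMS)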

I will combine this model with a federated learning approach. What I need to do is create 3 instances of the class above, access every weight variable in each layer of each model, and then average the corresponding variables across the models.
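This is a minimal sketch of the kind of averaging I'm aiming for; it simply averages every entry of the state_dicts, and average_models is my own hypothetical helper, not something from PyTorch or PyG:

import copy
import torch

def average_models(models):
    # Hypothetical helper: element-wise mean of every tensor in the models'
    # state_dicts (all models share the same architecture), written back
    # into each model. A plain FedAvg-style mean, with no per-client weighting.
    avg_state = copy.deepcopy(models[0].state_dict())
    for key in avg_state:
        avg_state[key] = torch.stack(
            [m.state_dict()[key] for m in models]
        ).mean(dim=0)
    for m in models:
        m.load_state_dict(avg_state)

# Usage with three local copies of the model:
# models = [DGN(MODEL_PARAMS) for _ in range(3)]
# ... train each model on its own data ...
# average_models(models)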

We know that, for example, the nn.Linear module has two weight variables, weight and bias, as stated on its documentation page under the Variables section:
https://pytorch.org/docs/stable/generated/torch.nn.Linear.html
However, there is no such section for the NNConv module:
https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#torch_geometric.nn.conv.NNConv
So I scanned its source code:
https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/conv/nn_conv.html#NNConv
Then I decided there are four weight variables:
- the weight and bias of the Linear layer inside the nn Sequential module, and
- the root and bias variables of the NNConv module itself.
I then averaged these 4 variables across the three models. The results I got, however, are not good, so I'm trying to understand where I went wrong. One possibility is that there is another weight variable in this layer, or that one of the variables I'm averaging doesn't actually hold weights, so the averaging doesn't help the model converge. This is my question:
If we need to access all weight variables of the NNConv module, which variables exactly should we access?
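For reference, this is the check I can run to list every parameter a single NNConv layer actually registers. The names in the comment are assumptions on my part and depend on the installed PyG version (e.g. the root weight may show up as root in older releases or as lin.weight in newer ones):

model = DGN(MODEL_PARAMS)  # using the hypothetical MODEL_PARAMS from above

# List every learnable parameter registered under conv1, including the ones
# contributed by the Sequential edge network passed in as `nn`.
for name, param in model.conv1.named_parameters():
    print(name, tuple(param.shape))

# Expected output is something like (exact names depend on the PyG version):
#   nn.0.weight, nn.0.bias   <- the Linear inside the Sequential
#   root (or lin.weight)     <- the root weight of NNConv
#   bias                     <- the bias of NNConv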
Thanks!