Does model.parameters() return the parameters in topologically sorted order?

Goal: to list the model parameters in the order in which they are executed during the forward pass, i.e. from the input layer to the output layer (their order in the computation graph).

Does the following guarantee that the parameters are traversed in the topologically sorted order of their execution?

for name, param in model.named_parameters():
    print(name, param.shape)

No, this will print the parameters in the order in which they were registered:

import torch.nn as nn
import torch.nn.functional as F

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.fc3 = nn.Linear(1, 1)
        self.fc2 = nn.Linear(1, 1)
        self.fc1 = nn.Linear(1, 1)
        
    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x

model = MyModel()
for name, param in model.named_parameters():
    print(name)

> fc3.weight
> fc3.bias
> fc2.weight
> fc2.bias
> fc1.weight
> fc1.bias

You could try to register the layers in the same order as they are executed in the forward method (if that’s possible for your model).
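To illustrate the suggestion, here is a minimal sketch (the model name `OrderedModel` is made up for this example): registering the submodules in the same order as they are called in `forward` makes `named_parameters()` yield them in execution order.

```python
import torch.nn as nn
import torch.nn.functional as F

class OrderedModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Registration order now matches execution order in forward
        self.fc1 = nn.Linear(1, 1)
        self.fc2 = nn.Linear(1, 1)
        self.fc3 = nn.Linear(1, 1)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)

model = OrderedModel()
for name, param in model.named_parameters():
    print(name)
# fc1.weight, fc1.bias, fc2.weight, fc2.bias, fc3.weight, fc3.bias
```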


Is there any guaranteed way to traverse the parameters in topological order? Any methods?

I’m really unsure so I’ll just post some ideas.

Since the computation graph is created during the forward pass, it might of course differ between passes (e.g. if your model uses data-dependent control flow).
You could try to use grad_fn and crawl the graph via grad_fn.next_functions.
However, this would yield the low-level autograd operations, not necessarily the layers, and I’m not sure how hard it would be to create the mapping. :confused:
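As a rough sketch of that idea: `AccumulateGrad` leaf nodes in the autograd graph hold a reference to their tensor in a `.variable` attribute (an undocumented implementation detail that may change between PyTorch versions), which lets you map the crawled nodes back to named parameters. Note the walk starts from the output, so the resulting order is roughly reverse execution order.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 4), nn.ReLU(), nn.Linear(4, 1))
out = model(torch.randn(1, 1))

# Map each parameter tensor to its name so we can recover names
# from the AccumulateGrad leaf nodes found in the graph.
id_to_name = {id(p): n for n, p in model.named_parameters()}

order = []
seen = set()

def crawl(fn):
    """Depth-first walk over grad_fn.next_functions."""
    if fn is None or fn in seen:
        return
    seen.add(fn)
    # AccumulateGrad nodes expose the leaf tensor as .variable
    # (implementation detail, not a stable public API).
    if hasattr(fn, "variable"):
        name = id_to_name.get(id(fn.variable))
        if name is not None:
            order.append(name)
    for next_fn, _ in fn.next_functions:
        crawl(next_fn)

crawl(out.grad_fn)
print(order)  # parameter names in graph-traversal (output-to-input) order
```

Reversing `order` would approximate forward execution order, but as noted above this yields operations/leaves, not layers, and the graph can change on every pass.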

Hi, I face the same issue. I need to apply the forward pass manually (layer by layer) without calling the forward method; I only have access to the class instance of the net. Is it possible to do so?
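One possible sketch, assuming the registration order matches the execution order and the forward method contains no purely functional ops (like `F.relu`) or branching: iterate over the registered children and apply them one by one. For an `nn.Sequential` this reproduces the normal forward pass; for arbitrary models it can silently differ.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 4), nn.ReLU(), nn.Linear(4, 1))
x = torch.randn(1, 1)

# Apply each registered child module in registration order.
out = x
for layer in model.children():
    out = layer(out)

# For an nn.Sequential this matches calling the model directly.
assert torch.allclose(out, model(x))
```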

@ptrblck Can we assume that the order returned by parameters() is always the same every time we call it?

I think so, as the internal _parameters attribute uses an OrderedDict, as seen here.
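A quick sanity check of that claim: iterating named_parameters() twice on the same model yields the same sequence of names.

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(2, 2), nn.Linear(2, 2))
first = [name for name, _ in model.named_parameters()]
second = [name for name, _ in model.named_parameters()]
assert first == second  # order is deterministic across calls
print(first)
```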
