Getting the Intermediate Output of a Self-Created Sequential

I created my own nn.Sequential that has many modules in it. I was able to create a PrintLayer module and place it after the output I want to save.

import torch.nn as nn

class PrintLayer(nn.Module):
    def __init__(self):
        super(PrintLayer, self).__init__()

    def forward(self, x):
        # print the shape of the intermediate output as it passes through
        print(x.shape)
        return x

It can print the shape and value of the output of the specific layer I want to observe during the forward pass, but how exactly do I save that value so I can retrieve it after the forward pass? Thanks!

You could store the activations in a dict using forward hooks.
I’ve created a small example in this thread.
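In short, the idea is a sketch along these lines (the toy model and the 'feats' key are just placeholders):

import torch
import torch.nn as nn

activation = {}

def get_activation(name):
    # returns a hook that stores the layer's output in the dict under the given key
    def hook(module, input, output):
        activation[name] = output.detach()
    return hook

model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 2))
model[0].register_forward_hook(get_activation('feats'))

out = model(torch.randn(1, 10))
print(activation['feats'].shape)  # the intermediate output is now available after the forward pass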


The problem is that my model is nested in a very complicated way, like this.

Would it be feasible to set the hook in the __init__ of some sub-module, e.g. UnetSkipConnectionBlock?
Otherwise, you would need to address your layers from top to bottom.
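E.g. a sketch of registering the hook inside the sub-module's __init__ could look like this (the block below is a simplified stand-in for UnetSkipConnectionBlock, not the real implementation):

import torch
import torch.nn as nn

class UnetSkipConnectionBlock(nn.Module):
    # simplified stand-in; the real block is more involved
    def __init__(self, activation_dict, name):
        super(UnetSkipConnectionBlock, self).__init__()
        self.model = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1), nn.ReLU())
        # register the hook on the inner Sequential at construction time
        self.model.register_forward_hook(
            lambda module, inp, out: activation_dict.update({name: out.detach()})
        )

    def forward(self, x):
        return self.model(x)

activations = {}
block = UnetSkipConnectionBlock(activations, 'inner')
out = block(torch.randn(1, 3, 8, 8))
print(activations['inner'].shape)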

Yeah, I can add stuff to the submodule, but I'm unsure how to add the hook there, since each submodule is a Sequential and isn't created like the example you have.

You would have to index the modules like in this example:

model = nn.Sequential(nn.Sequential(nn.Sequential(nn.Linear(10, 2))))
model[0][0][0].register_forward_hook  # index into each nested Sequential to reach the innermost layer, then register the hook on it

Thanks!

And then you would pass it a function like get_activation that creates a hook and saves the value in a dict?

Yes, then I would use the method I’ve linked in the other post.
Note that you might want to remove the .detach() in the hook if you want to backpropagate through the stored activation.
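Putting the pieces together, it could look roughly like this (the nested Sequential is just a toy stand-in for your model, and 'inner_linear' is an arbitrary key):

import torch
import torch.nn as nn

activation = {}

def get_activation(name):
    def hook(module, input, output):
        # drop .detach() here if you need gradients to flow through the stored tensor
        activation[name] = output.detach()
    return hook

model = nn.Sequential(nn.Sequential(nn.Sequential(nn.Linear(10, 2))))
# index through the nested Sequentials and register the hook on the innermost layer
model[0][0][0].register_forward_hook(get_activation('inner_linear'))

out = model(torch.randn(4, 10))
print(activation['inner_linear'].shape)  # torch.Size([4, 2])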

I’m still having trouble accessing the modules. When I print out the children with print(list(model.netG.children())) I get the UnetGenerator, and when I print model.netG.module (I had to use .module to get rid of the DataParallel error) I get the same UnetGenerator. Now when I try to index it, it says UnetGenerator doesn’t support indexing.

It seems some sub-modules have the name “model”.
Have a look at this small example of how to index the layers:

import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.fc1 = nn.Sequential(nn.Linear(10, 10), nn.ReLU())
        
    def forward(self, x):
        return self.fc1(x)

model = MyModel()
print(model)
model.fc1[0] # get linear layer
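Applied to your case, you would go through the attribute names instead of (or in addition to) indices. Here is a sketch with a toy stand-in for UnetGenerator; the real nesting may go deeper, so check print(model.netG.module) to find the exact path:

import torch
import torch.nn as nn

class UnetGeneratorToy(nn.Module):
    # toy stand-in: like the real UnetGenerator, it keeps its layers in an attribute named "model"
    def __init__(self):
        super(UnetGeneratorToy, self).__init__()
        self.model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 2))

    def forward(self, x):
        return self.model(x)

activation = {}

def get_activation(name):
    def hook(module, input, output):
        activation[name] = output.detach()
    return hook

netG = UnetGeneratorToy()
# attribute access first (.model), then index into its Sequential
# (with DataParallel you would go through .module first, e.g. model.netG.module.model[0])
netG.model[0].register_forward_hook(get_activation('first_linear'))

out = netG(torch.randn(4, 10))
print(activation['first_linear'].shape)  # torch.Size([4, 10])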