Extracting features from specific layers on a trained network

Hi,

I have a fully convolutional network (FCN) with the following architecture:

fcnn(
  (conv1d_block1): Sequential(
    (0): Conv1d(3, 128, kernel_size=(7,), stride=(1,), padding=(3,))
    (1): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (2): ReLU()
  )
  (conv1d_block2): Sequential(
    (0): Conv1d(128, 256, kernel_size=(5,), stride=(1,), padding=(2,))
    (1): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (2): ReLU()
  )
  (conv1d_block3): Sequential(
    (0): Conv1d(256, 128, kernel_size=(3,), stride=(1,), padding=(1,))
    (1): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (2): ReLU()
  )
  (global_avg_pool): AvgPool1d(kernel_size=(1800,), stride=(1800,), padding=(0,))
  (linear_1): Linear(in_features=128, out_features=1, bias=True)
  (sigmoid): Sigmoid()
)

As you can see, it does binary classification, so a forward pass gives me one probability per sample in the batch.
I would also like access to the output of global_avg_pool, i.e. a tensor of shape (batch_size, 128): the 128-dimensional feature representation learned by the network.

How could this be accomplished?

thanks!

You could return this activation in your forward method or alternatively use forward hooks as described here.
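
For example, a minimal sketch of the hook approach, assuming your model instance is called model and x is an input batch:

activations = {}

def save_activation(name):
    # returns a hook that stores the module's output under the given key
    def hook(module, inp, out):
        activations[name] = out.detach()
    return hook

# register the hook on the global average pooling layer
handle = model.global_avg_pool.register_forward_hook(save_activation('global_avg_pool'))

output = model(x)  # normal forward pass; the hook fires here
features = activations['global_avg_pool'].squeeze(-1)  # shape: (batch_size, 128)

handle.remove()  # remove the hook once you no longer need it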


Thank you for the detailed example!

So, to make sure I understand correctly: do I need to register a separate forward hook for each layer whose features I want to extract?

And first I run the forward pass, all the layers compute their outputs, and only then can I access the captured activations of the specific layers?

Also, is there no generic way to run the forward pass only up to a specific layer?
For example:

output = model.fc2(x)

That would be one possible method.
If you need a certain activation frequently, it might be easier to just return it from your forward method:

def forward(self, x, ret_act=False):
    x = F.relu(self.conv1(x))
    a = F.relu(self.conv2(x))   # intermediate activation we want to expose
    x = F.relu(self.conv3(a))
    if ret_act:
        return x, a             # final output plus the intermediate activation
    else:
        return x
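
Calling it with ret_act=True then returns both tensors (the names here just follow the snippet above):

output, act = model(x, ret_act=True)  # act is the conv2 activation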

If you just want the activation, you could also wrap all layers up to the particular layer in an nn.Sequential module and call that directly, but then you won't get the final model output.
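
A minimal sketch of that approach for the architecture you posted, reusing the submodule names from your print-out (assumes import torch.nn as nn):

feature_extractor = nn.Sequential(
    model.conv1d_block1,
    model.conv1d_block2,
    model.conv1d_block3,
    model.global_avg_pool,
)
features = feature_extractor(x).squeeze(-1)  # shape: (batch_size, 128)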

Based on your last suggestion, you could also try to simulate the forward pass, but I’m not a huge fan of this approach:

output = model.conv3(F.relu(model.conv2(F.relu(model.conv1(x)))))