For loop over encoder

Dear Community,

I was wondering whether one can for-loop over some encoder network (e.g. a ResNet-18 without the last fully connected layer) and afterwards concatenate the embeddings. At the end, a set of fully connected layers classifies the batch entities. Here is some example code.

    # x.shape = (bs, i, c, h, w) where "i" is some stack of images for each batch entity
    embs = []
    for i in range(x.shape[1]):  # x.shape is an attribute, not a method
        embs.append(self.encoder(x[:, i]))

    # concatenate the embeddings
    x = torch.cat(embs, 1)
    x = self.dense_layer1(x)
    x = self.dense_layer2(x)
    x = self.classifier(x)  # last dense layer
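
For context, here is a minimal self-contained sketch of the same idea as a full module (the ResNet-18 backbone, layer sizes, and class count are assumptions for illustration, not part of the snippet above):

    import torch
    import torch.nn as nn
    from torchvision import models

    class StackClassifier(nn.Module):
        def __init__(self, num_images, num_classes):
            super().__init__()
            # ResNet-18 with the final fully connected layer replaced by an identity
            backbone = models.resnet18(weights=None)
            backbone.fc = nn.Identity()
            self.encoder = backbone
            # each ResNet-18 embedding is 512-dimensional
            self.dense_layer1 = nn.Linear(512 * num_images, 256)
            self.dense_layer2 = nn.Linear(256, 128)
            self.classifier = nn.Linear(128, num_classes)

        def forward(self, x):
            # x.shape = (bs, i, c, h, w)
            embs = [self.encoder(x[:, i]) for i in range(x.shape[1])]
            x = torch.cat(embs, dim=1)             # (bs, i * 512)
            x = torch.relu(self.dense_layer1(x))
            x = torch.relu(self.dense_layer2(x))
            return self.classifier(x)              # logits, (bs, num_classes)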

Can autograd backpropagate the loss through this, or does anyone see any mistakes?
I would be thankful for any kind of feedback.

Best,
Paul

You can avoid the for loop.

    x = x.view(bs * i, c, h, w)   # keep the image dims, fold the stack into the batch
    out = model(x)                # single forward pass through the encoder
    out = out.view(bs, i, -1)     # (bs, i, emb_dim)
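
As a quick sanity check that the two approaches give the same result, here is a small sketch with a toy stand-in encoder (the shapes and the encoder itself are only illustrative):

    import torch
    import torch.nn as nn

    # toy stand-in for the real encoder, just to compare the two code paths
    encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 8))
    x = torch.randn(4, 5, 3, 32, 32)  # (bs, i, c, h, w)
    bs, i = x.shape[:2]

    # for-loop version: encode each image in the stack, then concatenate
    looped = torch.cat([encoder(x[:, k]) for k in range(i)], dim=1)

    # batched version: fold the stack into the batch dimension, single forward pass
    batched = encoder(x.view(bs * i, 3, 32, 32)).view(bs, -1)

    print(torch.allclose(looped, batched, atol=1e-6))  # True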

True, that seems to be a better approach. Though it appears the for loop does the trick as well, I guess it should run faster with a single forward pass.


The for loop will run slower; you can time it. In both cases the computation is almost the same, but with the for loop you also have Python loop overhead and list-append overhead.
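
For example, a rough timing sketch on CPU (ResNet-18 and the input sizes are arbitrary choices; on a GPU you would also need torch.cuda.synchronize() around the timed regions):

    import time
    import torch
    from torchvision import models

    encoder = models.resnet18(weights=None)
    encoder.fc = torch.nn.Identity()
    encoder.eval()

    x = torch.randn(8, 4, 3, 224, 224)  # (bs, i, c, h, w)
    bs, i, c, h, w = x.shape

    with torch.no_grad():
        t0 = time.time()
        torch.cat([encoder(x[:, k]) for k in range(i)], dim=1)  # for-loop version
        t1 = time.time()
        encoder(x.view(bs * i, c, h, w)).view(bs, -1)           # single forward pass
        t2 = time.time()

    print(f"for loop:       {t1 - t0:.3f} s")
    print(f"single forward: {t2 - t1:.3f} s")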