Mini-batching when a for-loop is involved in the forward operation

Hi all,

I am experimenting with an attention autoencoder that does the following (a rough stand-in for the modules is sketched right after the list):

  1. Each input is a tuple of arrays (e.g. a tuple of length 10, in which each array has a size of 90)
  2. The encoder encodes each array from size 90 down to size 30
  3. The encoded arrays are concatenated together to form an array of size 10 x 30 = 300
  4. The attention layer compresses the combined encoding to a size of 30 and decides which of the 10 arrays to pay attention to
  5. The decoder decodes the array of size 30 back to an array of size 90
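
For reference, a minimal stand-in for the three modules with the sizes above could look like the following (plain linear layers and placeholder names, just to make the shapes concrete):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttnAutoencoder(nn.Module):  # placeholder name
        def __init__(self, in_features=90, hidden=30, n_arrays=10):
            super().__init__()
            self.in_features = in_features
            self.encoder = nn.Linear(in_features, hidden)       # 90 -> 30 per array
            self.attn = nn.Linear(n_arrays * hidden, n_arrays)  # 300 -> 10 attention weights
            self.decoder = nn.Linear(hidden, in_features)       # 30 -> 90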

Hence, in forward, a for-loop has to be used to encode each array in the tuple:

    def forward(self, X):
        # encode the first array: (90,) -> (1, 1, 90) -> (1, 1, 30) -> (30,)
        x = torch.from_numpy(X[0]).view(1, -1, self.in_features)
        encoded_combined = self.encoder(x)
        encoded_combined = torch.squeeze(encoded_combined)
        # encode the remaining arrays one by one and concatenate -> (300,)
        for arr in X[1:]:
            x = torch.from_numpy(arr).view(1, -1, self.in_features)
            tail = self.encoder(x)
            tail = torch.squeeze(tail)
            encoded_combined = torch.cat((encoded_combined, tail))
        # one attention weight per array -> (10,)
        attn_w = F.softmax(self.attn(encoded_combined), dim=0)
        # weighted sum of the 10 encoded arrays -> (1, 30)
        attn_applied = torch.mm(attn_w.view(1, -1), encoded_combined.view(-1, 30))
        decoded = self.decoder(attn_applied.view(1, -1, 30))
        return decoded, attn_w

The above forward operation can only take one sample at a time, so the training process is very slow.

Is there any way to achieve mini-batching / get rid of the for-loop for such a dataset?
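
One idea I am wondering about (not sure whether it is correct): if every tuple always holds exactly 10 arrays of size 90, the whole mini-batch could perhaps be stacked into a single tensor of shape (batch, 10, 90) so the encoder runs only once and the for-loop disappears, roughly like this:

    def forward(self, X):
        # X is assumed to already be a float tensor of shape (batch, 10, 90)
        batch = X.size(0)
        encoded = self.encoder(X)                                # (batch, 10, 30), no Python loop
        encoded_combined = encoded.view(batch, -1)               # (batch, 300)
        attn_w = F.softmax(self.attn(encoded_combined), dim=1)   # (batch, 10)
        # batched weighted sum of the 10 encoded arrays -> (batch, 30)
        attn_applied = torch.bmm(attn_w.unsqueeze(1), encoded).squeeze(1)
        decoded = self.decoder(attn_applied)                     # (batch, 90)
        return decoded, attn_w

This assumes a fixed tuple length of 10; I am not sure how tuples of different lengths would be handled.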

Currently I have a collate_fcn as follows, which takes one sample at a time so that I do not need to disable automatic batching in the DataLoader:

    def collate_fcn_one_sample(data):
        # `data` is a list containing a single (tuple_of_arrays, target) pair
        data_batch = [sample[0] for sample in data]
        target_batch = [sample[1] for sample in data]
        # convert every numpy array to a tensor (reassigning a loop variable,
        # as in my earlier attempt, does not modify the lists in place)
        data_batch = [[torch.from_numpy(arr) for arr in group] for group in data_batch]
        target_batch = [torch.from_numpy(target) for target in target_batch]
        return data_batch[0], target_batch[0]
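
If it is safe to assume every sample has exactly 10 arrays of 90 floats, I guess the collate function could instead stack the whole batch into dense tensors, so the DataLoader would hand a (batch, 10, 90) input straight to a batched forward, something like:

    def collate_fcn_stack(batch):
        # each element of `batch` is a (tuple_of_10_arrays, target_array) pair
        data_batch = torch.stack([
            torch.stack([torch.from_numpy(arr) for arr in sample[0]])
            for sample in batch
        ])                                                        # (batch, 10, 90)
        target_batch = torch.stack([torch.from_numpy(sample[1]) for sample in batch])
        return data_batch, target_batch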

Much obliged for your help!!!

Many thanks.