PyTorch for-loop inefficiency

Hello

I have the following code and I'm wondering how I could vectorize the last loop. Here is an explanation of the code.

    features_list = torch.empty( (0, 1000), dtype=torch.float ).cuda()

First we create an empty tensor with 1000 columns (the output of VGG16 conv_layer5_3).

Batch size of the dataloader = 8

    counter = 0
    model.cuda()
    for i, data in enumerate(dataloader, 0):
        # Extract the tensors from the batch
        input, label = data
        input, label = input.to(device), label.to(device)
        n, c, h, w = input.size()
        outputs = model(input)

This outputs tensor has shape torch.Size([8, 1000]).

        if (i == 0):                                                    
            features_list = torch.cat( (features_list, outputs[0].view(1,-1)), 0)

On the very first iteration we add the first tensor of the mini-batch to features_list.

        dist_tensores = torch.cdist(outputs, features_list, p=2.0) 

The shape of this distance tensor depends on the iteration; the first iterations give (see the sketch below):
1st: torch.Size([8, 1])
2nd: torch.Size([8, 6])
3rd: torch.Size([8, 11])
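For reference, those shapes follow directly from how torch.cdist works: for inputs of shape [n, d] and [m, d] it returns an [n, m] matrix of pairwise distances. A minimal sketch with made-up sizes (not the real features):

    import torch

    outputs = torch.randn(8, 1000)        # one mini-batch of model outputs
    features_list = torch.randn(6, 1000)  # pretend 6 feature vectors were kept so far

    dist = torch.cdist(outputs, features_list, p=2.0)
    print(dist.shape)                     # torch.Size([8, 6]) -> [batch, len(features_list)]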

With AVG = 60:

        activation = torch.gt(dist_tensores, AVG, out=torch.cuda.FloatTensor(len(outputs), len(features_list)))

Its shape depends on the iteration, but it is always [8, X], where X is the number of tensors currently in features_list.
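As a side note, the comparison alone already returns that [8, X] boolean mask, and the per-row count the loop below relies on is a sum along dim=1 (across the stored features). A small sketch with dummy values (AVG = 60 as above, sizes made up):

    AVG = 60
    dist = torch.rand(8, 6) * 120           # dummy distances in [0, 120)
    activation = torch.gt(dist, AVG)        # bool mask, shape [8, 6]
    per_row = activation.sum(dim=1)         # for each output row: how many stored features it is "far" from
    print(activation.shape, per_row.shape)  # torch.Size([8, 6]) torch.Size([8])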

        counter = len(features_list)
        activation_list = torch.sum(activation, dim=0)   # per-column sums, shape [X]; not used below

        for x in range(len(activation)):
          if (torch.sum(activation[x], dim=0) == counter):   # every distance in row x exceeds AVG
            features_list = torch.cat( (features_list, outputs[x].view(1,-1)), 0)

If the sum over all entries of row x equals the number of rows of features_list (i.e. every entry in that row is True), we append the corresponding output tensor via torch.cat.

My issue is that I don't know how to append the tensor outputs[x] without using a loop. I suspect there is a vectorized way to do this comparison instead of a Python for loop.
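It is indeed just a vectorized comparison. Since counter and activation are computed before the inner loop, appending rows inside the loop never changes the condition for later rows, so the whole loop can be replaced by a boolean row selection. A minimal sketch, assuming the goal is "keep outputs[x] only if its distance to every stored feature exceeds AVG" and reusing the tensor names above:

        # rows of the batch where every distance exceeds AVG
        keep = torch.all(activation.bool(), dim=1)                       # shape [8], dtype bool
        # equivalently: keep = activation.sum(dim=1) == features_list.size(0)

        # boolean indexing appends all qualifying rows at once
        features_list = torch.cat((features_list, outputs[keep]), dim=0)

The .bool() call is only there in case activation was created as a float tensor (via the out= argument); if torch.gt is used with its default return type, it is already boolean.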

Double post from here.
