Google Colab, torch.randperm issue

Hello everyone,

I have an issue when using torch.randperm in Google Colab, and I can't figure out what is going wrong.

Since I'm relatively new to PyTorch and neural networks, I apologize in advance if I misuse or misunderstand some terms :wink:

Short version: the list of training images is randomly permuted before every training epoch. Interestingly, the random permutations from different experiments are exactly the same for the corresponding epoch.
So if an experiment (experiment 1) is started, the random permutation of epoch 1 might be [10485, 10999, 5432, …].
If I repeat the experiment (experiment 2), the random permutation of epoch 1 is again [10485, 10999, 5432, …].

Below you can find the code:

The function create_epoch_tuples gets called before each epoch to create the list of indices of training images to be used in the respective epoch (here that list is called self.qidxs).

def create_epoch_tuples(self, net):

    # draw a random permutation over the query pool and keep
    # the first self.qsize indices for this epoch
    idxs2qpool = torch.randperm(len(self.qpool))[:self.qsize]
    self.qidxs = [self.qpool[i] for i in idxs2qpool]
    self.pidxs = [self.ppool[i] for i in idxs2qpool]
    ...

The images are then returned in __getitem__ via:

def __getitem__(self, index):

    output = []
    # query image
    output.append(self.loader(self.images[self.qidxs[index]]))
    ...

I am opening this question because I am confused about whether this behaviour of torch.randperm is "normal"; I expected different permutations between experiments as well.
Thanks for any help!

Is it possible that some earlier code is setting a seed manually? For example:

import torch
torch.manual_seed(42)
torch.randperm(20)

will give you the same result every time.
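
If something in the runtime (or an imported training script) has indeed seeded the RNG and you want different permutations per run, one option is to re-seed from a non-deterministic source at the start of each experiment. A minimal sketch: torch.seed() reseeds the default generator non-deterministically and returns the 64-bit seed it used, so you can log it to reproduce a specific run later.

import torch

# re-seed the default RNG from a non-deterministic source;
# the returned seed can be logged to reproduce this run later
seed = torch.seed()
print("seed for this run:", seed)

print(torch.randperm(20))  # now differs between experiments

Alternatively, passing a dedicated generator, e.g. torch.randperm(n, generator=g) with g = torch.Generator(), keeps the epoch shuffle independent of any global torch.manual_seed calls elsewhere in the notebook.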


Thanks for your answer!