Same transform for a paired image?

Hello all, I have paired images, e.g. img1 and img2, and I want to apply the same transform to both of them during training:

transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
img1 = transform(img1)
img2 = transform(img2)

Is it possible to do this in a PyTorch DataLoader?

You can concatenate the images along the channel dimension and then apply the transform. You could also check out kornia, which supports exactly this use case.
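A minimal sketch of the concatenation idea, assuming img1 and img2 are already tensors of shape [C, H, W] (recent torchvision versions can apply transforms to tensor inputs):

import torch
from torchvision import transforms

img1 = torch.rand(3, 224, 224)  # placeholder tensors just for illustration
img2 = torch.rand(3, 224, 224)

# Concatenate along the channel dim, apply the random transform once,
# then split back into the two images.
flip = transforms.RandomHorizontalFlip()
both = torch.cat([img1, img2], dim=0)   # shape: [C1 + C2, H, W]
both = flip(both)                       # the same flip is applied to all channels
img1, img2 = torch.split(both, [3, 3], dim=0)  # split back into the 3-channel images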

Alternatively, you could use the functional API from torchvision, as shown in this example.
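For completeness, a small sketch of the functional approach, drawing the random decision once and applying it to both images (the paired_transform name is just illustrative):

import random
import torchvision.transforms.functional as TF

def paired_transform(img1, img2):
    # Draw the random parameter once and reuse it for both images
    if random.random() < 0.5:
        img1 = TF.hflip(img1)
        img2 = TF.hflip(img2)
    return TF.to_tensor(img1), TF.to_tensor(img2)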


Does this approach work with distributed training?

If you are using multiple processes, each process would still apply the same transformations on the data and target. The transformations will be different for each sample (in a single process and in a distributed setup).
Could you explain a bit more, what your use case is and what you would like to do?

Thanks @ptrblck. We use 16 workers, and the current data loading code is:

train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=RGB_MEAN, std=RGB_STD),
])
dataset_train = FaceDataset(DATA_ROOT, RECORD_DIR, train_transform)
train_sampler = torch.utils.data.distributed.DistributedSampler(dataset_train)
train_loader = torch.utils.data.DataLoader(
    dataset_train, batch_size=batch_size, shuffle=(train_sampler is None),
    num_workers=workers, pin_memory=True, sampler=train_sampler, drop_last=True)
SAMPLE_NUMS = dataset_train.get_sample_num_of_each_class()
NUM_CLASS = len(train_loader.dataset.classes)
 

The dataset implementation is based on https://github.com/HuangYG123/CurricularFace/blob/8b2f47318117995aa05490c05b455b113489917e/dataset/datasets.py#L92, where __getitem__ does:

path, target = self.imgs[index]
sample = Image.open(path)
sample = sample.convert("RGB")
if self.transform is not None:
    sample = self.transform(sample)

Is it fine to replace the transform in the above code with your customized transformation?

Yes, it should be alright.
Note that using multiple workers is not a distributed setup, which we define as using multiple devices or nodes.
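If the dataset should return an image pair, the quoted __getitem__ could be adapted roughly along these lines (a sketch only; path1/path2 and the two-argument transform are assumptions, not part of the linked code):

from PIL import Image

def __getitem__(self, index):
    # Assumption: self.imgs stores (path1, path2, target) tuples
    path1, path2, target = self.imgs[index]
    img1 = Image.open(path1).convert("RGB")
    img2 = Image.open(path2).convert("RGB")
    if self.transform is not None:
        # self.transform is a paired transform, e.g. paired_transform above
        img1, img2 = self.transform(img1, img2)
    return img1, img2, target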