Hello all, I have a pair of images, e.g. img1 and img2. I want to apply the same transform to both images during training:
transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
img1 = transform(img1)
img2 = transform(img2)
Is it possible to do this in the PyTorch DataLoader?
Kushaj (Kushajveer Singh)
November 11, 2020, 8:31pm
You can concatenate the images along the channel dim and then apply the transform. You can also check out kornia, which does exactly what you want.
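For illustration, a minimal sketch of that concatenation idea (not from the original post; it assumes 3-channel RGB inputs and a torchvision version whose transforms accept tensors):

import torch
from torchvision import transforms

to_tensor = transforms.ToTensor()
flip = transforms.RandomHorizontalFlip()

# Convert both PIL images to tensors of shape [3, H, W].
t1, t2 = to_tensor(img1), to_tensor(img2)
# Stack along the channel dim so one random decision covers both images.
pair = flip(torch.cat([t1, t2], dim=0))  # [6, H, W]
t1, t2 = pair[:3], pair[3:]              # split back into the two images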
ptrblck
November 12, 2020, 10:26am
Alternatively, you could also use the functional API from torchvision, as given in this example.
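A minimal sketch of that functional-API pattern (a common paired-transform recipe, not the exact linked code): sample the random decision once, then apply the same deterministic op to both images.

import random
import torchvision.transforms.functional as TF

def paired_transform(img1, img2):
    # Draw the random decision once so both images get the same flip.
    if random.random() < 0.5:
        img1 = TF.hflip(img1)
        img2 = TF.hflip(img2)
    return TF.to_tensor(img1), TF.to_tensor(img2)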
Does this approach work with distributed training?
ptrblck
November 12, 2020, 8:40pm
If you are using multiple processes, each process would still apply the same transformations on the data and target. The transformations will be different for each sample (in a single process and in a distributed setup).
Could you explain a bit more, what your use case is and what you would like to do?
Thanks @ptrblck. We use 16 workers, and the current dataloader is:
train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=RGB_MEAN, std=RGB_STD),
])
dataset_train = FaceDataset(DATA_ROOT, RECORD_DIR, train_transform)
train_sampler = torch.utils.data.distributed.DistributedSampler(dataset_train)
train_loader = torch.utils.data.DataLoader(
    dataset_train, batch_size=batch_size, shuffle=(train_sampler is None),
    num_workers=workers, pin_memory=True, sampler=train_sampler, drop_last=True)
SAMPLE_NUMS = dataset_train.get_sample_num_of_each_class()
NUM_CLASS = len(train_loader.dataset.classes)
And the current dataloader is adapted from https://github.com/HuangYG123/CurricularFace/blob/8b2f47318117995aa05490c05b455b113489917e/dataset/datasets.py#L92
path, target = self.imgs[index]
sample = Image.open(path)
sample = sample.convert("RGB")
if self.transform is not None:
    sample = self.transform(sample)
Is it fine to replace the above code with your customized transformation?
ptrblck
November 12, 2020, 9:02pm
Yes, it should be alright.
Note that using multiple workers is not a distributed setup, which we define as using multiple devices or nodes.
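For reference, a hypothetical sketch of how such a paired transform could slot into the dataset's __getitem__ (paired_transform is the helper sketched above; the self.pairs record layout is illustrative, not from the linked repo):

def __getitem__(self, index):
    path1, path2, target = self.pairs[index]  # hypothetical paired record
    img1 = Image.open(path1).convert("RGB")
    img2 = Image.open(path2).convert("RGB")
    # Apply the same random augmentation to both images.
    img1, img2 = paired_transform(img1, img2)
    return img1, img2, target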