How to make gradient flow back through torchvision.transforms

I would like to include the transform mechanism within the loss function. However, it looks like the gradient won't flow back through the transforms. Is there a way to make the gradient flow back through a set of torchvision.transforms? I will use transforms.RandomResizedCrop() and transforms.RandomHorizontalFlip().

kornia provides differentiable transformations, so you could check it out.
CC @edgarriba
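
Something like this minimal sketch could work. It's only an illustration, assuming kornia.augmentation.RandomResizedCrop and RandomHorizontalFlip (which mirror the torchvision names) and operate on batched tensors:

```python
import torch
import kornia.augmentation as K

# Augmentations are nn.Modules that work on tensors directly,
# so they can be chained with nn.Sequential.
aug = torch.nn.Sequential(
    K.RandomResizedCrop(size=(224, 224)),
    K.RandomHorizontalFlip(p=0.5),
)

x = torch.rand(4, 3, 256, 256, requires_grad=True)  # batch of images
out = aug(x)                # augmentations stay in the autograd graph
loss = out.mean()           # dummy loss on the augmented output
loss.backward()
print(x.grad is not None)   # True -> gradients reached the input
```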

@ptrblck Thank you for the suggestion! I was just wondering whether the torchvision.transforms.xxx classes are differentiable transformations, since they inherit from torch.nn.Module, e.g. torchvision.transforms.RandomResizedCrop.

Yes, I think if a transformation accepts tensors it might be differentiable. E.g. RandomResizedCrop should be a crop and resize operation, which should not detach the tensor from the computation graph.
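
You could verify it with a quick sketch like this (assuming torchvision >= 0.8, where the transforms accept tensors and subclass nn.Module); a non-None gradient on the input means the graph was not broken:

```python
import torch
import torchvision.transforms as T

# Tensor-based transforms can be composed with nn.Sequential.
transform = torch.nn.Sequential(
    T.RandomResizedCrop(224),
    T.RandomHorizontalFlip(p=0.5),
)

x = torch.rand(3, 256, 256, requires_grad=True)  # single CHW image tensor
out = transform(x)          # crop/resize + flip keep the tensor in the graph
loss = out.sum()            # dummy scalar loss
loss.backward()
print(x.grad.shape)         # gradients flow back to the full input image
```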
