Gradients over Transformation operations

I am applying some transformations to an image. Is it possible to backpropagate through these operations using autograd?

data_transforms = transforms.Compose([
    # ... (transform list truncated in the original post)
])

I am fairly new to PyTorch, so excuse me if I am asking something too obvious.

Those operations are implemented with the PIL library. Gradients are tracked only if you use PyTorch tensors and PyTorch operators, so autograd cannot backpropagate through them as-is.
If you reimplement them in terms of PyTorch operators, you can backpropagate through them.
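To illustrate, here is a minimal sketch of this idea: a hypothetical transform (the function name and the resize/normalize choices are illustrative, not from the original post) built entirely from PyTorch operators, so gradients flow back to the input image.

```python
import torch
import torch.nn.functional as F

def differentiable_transform(img):
    """A stand-in for a PIL-based transform, rebuilt from PyTorch ops.

    img: float tensor of shape (N, C, H, W).
    """
    # Bilinear resize is differentiable with respect to the pixel values.
    img = F.interpolate(img, size=(32, 32), mode="bilinear", align_corners=False)
    # Normalization (subtract mean, divide by std) is also differentiable.
    return (img - img.mean()) / (img.std() + 1e-8)

img = torch.rand(1, 3, 64, 64, requires_grad=True)
out = differentiable_transform(img)
out.sum().backward()
print(img.grad is not None)  # gradients reached the input image
```

The same pattern applies to other geometric or color transforms: as long as each step is expressed with tensor operations, autograd handles the backward pass automatically.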


I think kornia is able to differentiate through (some of) these operations.
CC @edgarriba, who is one of the main authors of this library.
