I have a question regarding tensor type conversion.
I have a tensor x with requires_grad=True, which is computed from a leaf tensor that I created.
For some reason, I need x as a long tensor, so I applied x = x.long(). However, the resulting x no longer has requires_grad=True.
How can I propagate requires_grad when converting the tensor to long?
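A minimal snippet reproducing the behavior described above (variable names are illustrative): casting to an integer dtype detaches the result from the graph, because autograd only tracks floating-point (and complex) tensors.

```python
import torch

# Leaf tensor created by the user
w = torch.randn(3, requires_grad=True)

# x is computed from the leaf, so it requires grad
x = w * 10.0
print(x.requires_grad)  # True

# Casting to long detaches x from the autograd graph
x_long = x.long()
print(x_long.requires_grad)  # False

# Asking a long tensor to require grad raises a RuntimeError,
# since only floating-point and complex tensors can require gradients.
try:
    x_long.requires_grad_(True)
except RuntimeError as e:
    print(e)
```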
Indeed, a long tensor cannot have a gradient. Then I wonder: can the indices be made differentiable for image.index_put_(indices, values, accumulate=True)? The indices are pixel locations, which require grad and are computed from the leaf tensor. image is a zero tensor. I want to write some values into image at specific pixels. Is it possible to allow differentiable indices in this case?
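A small sketch of the situation (shapes and names are illustrative): with index_put, gradients flow back to the written values, but the cast to long detaches the pixel locations, so they receive no gradient.

```python
import torch

image = torch.zeros(4, 4)

# Pixel locations computed from a leaf tensor; the cast to long for
# indexing detaches them, so no gradient can flow back through them.
loc = torch.tensor([0.2, 1.7, 3.1], requires_grad=True)
rows = loc.long()  # requires_grad is False
cols = loc.long()

values = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Out-of-place variant so the zero image itself needs no grad
out = image.index_put((rows, cols), values, accumulate=True)
out.sum().backward()

print(values.grad)  # tensor([1., 1., 1.]) — values get gradients
print(loc.grad)     # None — the locations do not
```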
That would be disallowed, wouldn’t it?
You could probably try to come up with a custom autograd.Function, which would define (somehow) the backward pass and the gradients for the indices, although I don’t have a good idea if and how this could work.
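One possible workaround, not from the thread and named plainly: instead of writing a custom autograd.Function, replace the hard integer write with a linear "splat", i.e. split each value between the two neighbouring integer positions, weighted by distance. The hard floor is detached, but the fractional offset stays differentiable, so gradients w.r.t. the real-valued locations exist. `soft_put_1d` is a hypothetical helper (shown in 1D for brevity):

```python
import torch

def soft_put_1d(length, loc, values):
    """Differentiable alternative to hard integer indexing: each value is
    split between floor(loc) and floor(loc)+1, weighted linearly, so
    gradients flow back to the real-valued locations."""
    lo = loc.detach().floor().long().clamp(0, length - 2)
    frac = loc - lo.float()  # differentiable offset in [0, 1)
    out = values.new_zeros(length)
    out = out.scatter_add(0, lo, values * (1 - frac))
    out = out.scatter_add(0, lo + 1, values * frac)
    return out

loc = torch.tensor([0.25, 2.5], requires_grad=True)
vals = torch.tensor([1.0, 4.0], requires_grad=True)
img = soft_put_1d(5, loc, vals)
print(img)  # tensor([0.7500, 0.2500, 2.0000, 2.0000, 0.0000])

# A position-weighted loss (a plain sum would give zero location
# gradients, since the splat conserves total mass)
loss = (img * torch.arange(5.0)).sum()
loss.backward()
print(loc.grad)  # tensor([1., 4.]) — locations now receive gradients
```

The same idea extends to 2D as bilinear splatting over the four neighbouring pixels, using scatter_add on flattened row/column indices.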