I want to do something like this, but I need it to be differentiable w.r.t. the index tensors. Is there any way to achieve this?

```
import torch
# initialize tensor
tensor = torch.zeros((1, 400, 400)).double()
tensor.requires_grad_(True)
# create index ranges
x_range = torch.arange(150, 250).double()
x_range.requires_grad_(True)
y_range = torch.arange(150, 250).double()
y_range.requires_grad_(True)
# get indices of flattened tensor
x_range = x_range.long().repeat(100, 1)
y_range = y_range.long().repeat(100, 1)
y_range = y_range.t()
tensor_size = tensor.size()
indices = y_range.mul(tensor_size[2]).add(x_range).view((1, -1))  # flat index = row * width + col (0-based)
# create patch
patch = torch.ones((1, 100, 100)).double()
# flatten tensor
tensor_flattened = tensor.contiguous().view((1, -1))
# set patch at indices of tensor_flattened (out-of-place scatter, since
# in-place scatter_ is not allowed on a view of a leaf requiring grad)
tensor_flattened = tensor_flattened.scatter(1, indices, patch.view(1, -1))
tensor = tensor_flattened.view(tensor_size)
# sum up for scalar output for calling backward()
tensor_sum = tensor.sum()
# calling backward()
tensor_sum.backward()
# alternative to avoid summing the tensor (use instead of the backward() call above):
# tensor.backward(torch.ones_like(tensor))
```
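As far as I can tell, the core obstacle is that `scatter`/indexing ops require integer index tensors, and casting to an integer dtype detaches the result from the autograd graph (integer tensors cannot carry gradients). A minimal demonstration of where the graph breaks:

```python
import torch

# a float range that should receive gradients
x_range = torch.arange(150., 250.)
x_range.requires_grad_(True)

# casting to long (required for use as indices) detaches the values:
# the resulting integer tensor does not require grad, so no gradient
# can flow back to x_range through any op that consumes idx
idx = x_range.long()
print(x_range.requires_grad)  # True
print(idx.requires_grad)      # False
```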
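One possible workaround I have been considering (an assumption on my part, not part of the snippet above): replace the hard integer indices with a soft mask built from continuous boundary parameters, so the patch position stays differentiable. A sketch with hypothetical bounds `y0`, `y1`, `x0`, `x1` and a `sharp` steepness constant:

```python
import torch

H = W = 400

# hypothetical continuous patch bounds (these replace the index ranges)
y0 = torch.tensor(150.0, dtype=torch.float64, requires_grad=True)
y1 = torch.tensor(250.0, dtype=torch.float64, requires_grad=True)
x0 = torch.tensor(150.0, dtype=torch.float64, requires_grad=True)
x1 = torch.tensor(250.0, dtype=torch.float64, requires_grad=True)

# pixel coordinate grids
ys = torch.arange(H, dtype=torch.float64).view(-1, 1)  # (H, 1)
xs = torch.arange(W, dtype=torch.float64).view(1, -1)  # (1, W)

# soft rectangular mask: ~1 inside [y0, y1] x [x0, x1], ~0 outside,
# with smooth (hence differentiable) edges controlled by `sharp`
sharp = 1.0
mask_y = torch.sigmoid(sharp * (ys - y0)) * torch.sigmoid(sharp * (y1 - ys))
mask_x = torch.sigmoid(sharp * (xs - x0)) * torch.sigmoid(sharp * (x1 - xs))
mask = (mask_y * mask_x).unsqueeze(0)  # (1, H, W)

# gradients now flow back to the patch-boundary parameters
mask.sum().backward()
print(x0.grad)  # a real gradient, unlike with integer indices
```

The mask can then be combined with the patch (e.g. `mask * patch_value`) instead of scattering into a flattened tensor; how well this approximates the hard-indexed version depends on the chosen sharpness.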