Hi, I used to create a leaf variable like this:
y = torch.autograd.Variable(torch.zeros([batch_size, c, h, w]), requires_grad=True)
Then I want to assign values to an indexed part of y as below (y_local is a Variable computed from other variables; I want to copy its value into part of y while ensuring that gradients flowing through y reach y_local):
y.data[:,:,local_x[i]:local_x[i+1],local_y[i]:local_y[i+1]] = y_local.data
I am wondering whether such an operation supports normal backward gradient flow to the y_local variable?
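To make the setup concrete, here is a minimal runnable sketch (shapes and slice bounds are made up for illustration, and it uses plain tensors since recent PyTorch merged Variable into Tensor). It assigns the slice through the tensor itself rather than through .data, which is the gradient path I want to compare against:

```python
import torch

batch_size, c, h, w = 2, 3, 4, 4

# Hypothetical slice bounds standing in for local_x / local_y.
x0, x1, y0, y1 = 0, 2, 0, 2

# y_local is computed from another tensor that requires grad.
src = torch.randn(batch_size, c, x1 - x0, y1 - y0, requires_grad=True)
y_local = src * 2.0

# Build y WITHOUT requires_grad, then assign through the tensor itself
# (not through .data) so the index assignment is recorded by autograd.
y = torch.zeros(batch_size, c, h, w)
y[:, :, x0:x1, y0:y1] = y_local

y.sum().backward()
print(src.grad)  # each element of src receives gradient 2.0
```

Here y becomes part of the graph after the in-place index assignment (its requires_grad flips to True), so backward reaches src through y_local.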