I want to optimize a tensor, but not all of its elements. For example:
import torch

test_tensor = torch.zeros((1, 10475, 3), requires_grad=True)
optimizer = torch.optim.Adam([test_tensor[:, rest_idx, :]], lr=0.005, betas=(0.9, 0.999))
where rest_idx is a sorted array of arbitrary indices between 0 and 10474. The code above will not work because the slice is no longer a leaf tensor. Splitting test_tensor into two parts and concatenating them afterward does not seem applicable either, because the selected indices can be scattered arbitrarily.
Is there an efficient way to solve this problem? Thanks a lot in advance!
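For reference, here is one common workaround I have seen (a sketch, not a definitive answer): keep the full tensor as the single leaf parameter, and zero out the gradients of the frozen entries with a gradient hook, so the optimizer only ever moves the entries selected by rest_idx. The rest_idx values and the toy loss below are made-up placeholders.

```python
import torch

# Full tensor stays the leaf parameter, as in the question.
test_tensor = torch.zeros((1, 10475, 3), requires_grad=True)
rest_idx = torch.tensor([0, 5, 100, 9999])  # placeholder indices to optimize

# Float mask: 1.0 at trainable indices, 0.0 elsewhere.
mask = torch.zeros_like(test_tensor)
mask[:, rest_idx, :] = 1.0

# Hook multiplies every incoming gradient by the mask, so frozen
# entries always receive exactly zero gradient (and Adam, with its
# default weight_decay=0, then never updates them).
test_tensor.register_hook(lambda grad: grad * mask)

optimizer = torch.optim.Adam([test_tensor], lr=0.005, betas=(0.9, 0.999))

loss = ((test_tensor - 1.0) ** 2).sum()  # toy loss with nonzero gradient
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

After the step, only the rows indexed by rest_idx have moved; everything else is still zero. Note that the frozen entries still sit in the optimizer and in memory; if only a small fraction of rows is trainable, gathering them into a separate leaf tensor and scattering back with index_copy or index_put_ each forward pass may be more efficient.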