Optimize part of a tensor at selected indices

Hey,

I want to optimize a tensor but not all elements of that tensor. For example:

test_tensor = torch.zeros((1, 10475, 3), requires_grad=True)
optimizer = torch.optim.Adam([test_tensor[:, rest_idx, :]], lr=0.005, betas=(0.9, 0.999))

where rest_idx is a sorted array of arbitrary indices between 0 and 10474. The code above does not work because the slice is no longer a leaf tensor. Splitting test_tensor into two parts and concatenating them afterwards is not applicable either, I guess, because the selected indices can be quite arbitrary.

Is there any efficient way to solve this problem? Thanks a lot in advance!

Best

# test_leaf is the leaf tensor with requires_grad=True; idx holds the indices to optimize
binary_mask = torch.zeros_like(test_leaf)
binary_mask[:, idx, :] = 1.  # advanced indexing returns a copy, so assign instead of fill_()
test_tensor = test_leaf * binary_mask
# if the other (frozen) values are needed as well
test_tensor = test_tensor + test_leaf.detach() * binary_mask.logical_not()
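
For completeness, here is a minimal sketch of how this could be wired into a training loop; rest_idx and the squared-error loss below are just illustrative placeholders, and note that only the leaf tensor is passed to the optimizer:

import torch

rest_idx = torch.tensor([0, 5, 17])                        # hypothetical indices to optimize
test_leaf = torch.zeros((1, 10475, 3), requires_grad=True)
optimizer = torch.optim.Adam([test_leaf], lr=0.005, betas=(0.9, 0.999))

binary_mask = torch.zeros_like(test_leaf)
binary_mask[:, rest_idx, :] = 1.

for step in range(100):
    optimizer.zero_grad()
    # gradients only flow into the selected positions of test_leaf
    test_tensor = test_leaf * binary_mask + test_leaf.detach() * binary_mask.logical_not()
    loss = (test_tensor - 1.).pow(2).sum()                 # dummy loss just for illustration
    loss.backward()
    optimizer.step()

With the default weight_decay of 0, the non-selected entries of test_leaf should stay at their initial values, since their gradients (and hence Adam's moment estimates) remain zero.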


Hey Alex,

could you explain further? I think the approach you suggested only sets the selected elements of test_tensor to zero. What I want is for the remaining parts not to be optimized.

Given test_leaf = [1, 2, 3] and mask = [1, 1, 0], that code computes

test_tensor = [1, 2, 0] + [0, 0, 3] = [1, 2, 3]

Because the second summand was detached, test_leaf's gradient will be zero at the positions where the mask is zero.
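
A quick way to check this, using the same three-element example (values purely illustrative):

import torch

test_leaf = torch.tensor([1., 2., 3.], requires_grad=True)
mask = torch.tensor([1., 1., 0.])

test_tensor = test_leaf * mask + test_leaf.detach() * mask.logical_not()
test_tensor.sum().backward()

print(test_tensor)     # tensor([1., 2., 3.], grad_fn=<AddBackward0>)
print(test_leaf.grad)  # tensor([1., 1., 0.]) -- zero where the mask is zero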

Note that the usual adaptive optimizers like Adam won't calculate the gradient moments correctly for such parameters. There is torch.optim.SparseAdam, but I'm not sure how to use it here.