Updating a few channels of the input

I’m trying to optimize specific channels of the input. My network weights are fixed and I just want to update select channels of the input. I was able to narrow the issue down to the following bits of code. In my understanding, the following two pieces of code should be identical:

parameters = []
for i in range(nTot):
    parameters.append(inpt[i][:, :channel])  # slice: first `channel` channels
    parameters.append(inpt[i][:, channel:])  # slice: remaining channels
optimizer = optim.Adam(parameters, lr=model.lr, weight_decay=0)

and

parameters = []
for i in range(nTot):
    parameters.append(inpt[i])  # whole tensors
optimizer = optim.Adam(parameters, lr=model.lr, weight_decay=0)

Here inpt is a list of tensors and channel is an integer. I find that in the second case my inputs get updated, but in the first case they do not. If this works, I’ll modify the first snippet and should then be able to update only a few channels of the input. Many thanks!

Your first approach won’t work, as you are trying to optimize non-leaf tensors created by slicing, and this error is raised:

ValueError: can't optimize a non-leaf Tensor

If you want to optimize parts of a parameter, either zero out the gradients of the frozen part, or create separate parameters and use e.g. torch.cat or torch.stack to build the full parameter before applying it in your model.
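A minimal sketch of the first option (zeroing out the gradients of the frozen channels), assuming a toy Conv2d stands in for the fixed network; the names inpt and channel mirror the question, everything else is illustrative:

import torch
import torch.optim as optim

model = torch.nn.Conv2d(4, 1, kernel_size=3)  # stand-in for the fixed network
for p in model.parameters():
    p.requires_grad_(False)  # network weights stay fixed

inpt = torch.randn(1, 4, 8, 8, requires_grad=True)  # leaf tensor
channel = 2
optimizer = optim.Adam([inpt], lr=1e-3, weight_decay=0)

optimizer.zero_grad()
loss = model(inpt).sum()  # placeholder loss
loss.backward()
inpt.grad[:, channel:] = 0  # freeze channels >= channel
optimizer.step()  # only channels < channel are updated

Since the frozen entries see a zero gradient on every step (and weight_decay=0), Adam’s running moments stay zero for them and they are never changed.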


Thanks a lot for your reply. I was able to do this with something similar to the second approach you suggested!
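For reference, a minimal sketch of what that second approach can look like, again with a toy Conv2d standing in for the fixed network; all names and shapes here are illustrative, not the poster’s actual code:

import torch
import torch.optim as optim

model = torch.nn.Conv2d(4, 1, kernel_size=3)  # stand-in for the fixed network
for p in model.parameters():
    p.requires_grad_(False)

trainable = torch.randn(1, 2, 8, 8, requires_grad=True)  # channels to optimize
frozen = torch.randn(1, 2, 8, 8)                         # channels to keep fixed
optimizer = optim.Adam([trainable], lr=1e-3, weight_decay=0)

optimizer.zero_grad()
x = torch.cat([trainable, frozen], dim=1)  # rebuild the full 4-channel input
loss = model(x).sum()  # placeholder loss
loss.backward()
optimizer.step()  # only `trainable` is updated

Here both pieces are leaf tensors, so they can be passed to the optimizer directly, and torch.cat inside the forward pass routes gradients only into the trainable part.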