Optimize only a slice of a tensor (or single entries)

I’m trying to use autograd to optimize some parameters stored inside a tensor.
Say I have a tensor of size 10 and I would like to optimize only the first entry of that tensor.
So I call:
torch.optim.Adam([variables[0]], lr=self.lr)

The problem is that selecting the entry is itself an operation tracked by the autograd graph, so the result is no longer a leaf tensor and the optimizer rejects it.
What can be an easy way to solve this?
I need this kind of setup because my tensor contains a lot of information that I need in order to visualize and compute the loss, but I don’t want to optimize all of its entries.

Help is appreciated.
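For anyone hitting the same error, here is a minimal reproduction of the problem described above (the tensor name is illustrative): indexing a leaf tensor returns a new tensor produced by an autograd operation, and `torch.optim` only accepts leaf tensors.

```python
import torch

# A leaf tensor holding all 10 parameters.
variables = torch.randn(10, requires_grad=True)
print(variables.is_leaf)  # True

# Indexing is an autograd op, so the result is NOT a leaf.
entry = variables[0]
print(entry.is_leaf)  # False

# Passing a non-leaf tensor to an optimizer raises a ValueError.
try:
    torch.optim.Adam([entry], lr=0.1)
except ValueError as e:
    print(e)  # "can't optimize a non-leaf Tensor"
```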

Hello Nio,

I am not sure I fully understand the problem, but you could try creating two tensors: one with the variables you need to optimize, and one with the rest. Whenever you need the full vector, concatenate the two.

I don’t know if that’s applicable to your problem, but maybe it helps.
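A minimal sketch of that idea, assuming a 10-entry vector where only the first entry is trained (the names and the quadratic loss are placeholders):

```python
import torch

# Leaf tensor for the single trainable entry, and a fixed tensor
# for the remaining nine entries.
trainable = torch.randn(1, requires_grad=True)
fixed = torch.randn(9)  # requires_grad defaults to False

# Only the trainable leaf is handed to the optimizer.
optimizer = torch.optim.Adam([trainable], lr=0.01)

for _ in range(100):
    optimizer.zero_grad()
    # Rebuild the full 10-entry vector each step; gradients flow
    # through torch.cat back to `trainable` only.
    variables = torch.cat([trainable, fixed])
    loss = (variables ** 2).sum()  # placeholder loss
    loss.backward()
    optimizer.step()
```

Note that `torch.cat` must be called inside the loop so the graph is rebuilt each iteration; `fixed` is never updated because it neither requires grad nor is registered with the optimizer.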