Optimize with respect to specific dimensions of a parameter tensor

I want to optimize a loss with respect to specific dimensions of a parameter.
I have a parameter z to optimize, but only with respect to a certain part of it; I want to keep the other parts fixed. Like this:

optimizer = torch.optim.Adam(params=[z[:, start:end]], lr=0.01)

but this code raises
ValueError: can’t optimize a non-leaf Tensor

How can I do it?
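(For reference, a minimal check with made-up shapes shows what the error means: indexing produces a non-leaf tensor, and optimizers only accept leaf tensors.)

import torch

z = torch.randn(4, 10, requires_grad=True)
print(z.is_leaf)          # True: z was created directly by the user
print(z[:, 3:7].is_leaf)  # False: the slice has a grad_fn, so Adam rejects it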

I think wrapping that ‘sliced part’ into a Variable and passing it to the optimizer should work.

Sorry, I'm new to PyTorch.
If I wrap the sliced part and optimize it like this,

optimizer = torch.optim.Adam(params=[Variable(z[:, start:end])], lr=0.01)

will the original z be updated?
If not, how can I get the updated z in its original shape?

I found that wrapping the sliced part of the parameter in a Variable doesn't update the original parameter. I noticed it because the loss function doesn't change:

tensor(1.00000e-03 *
       5.0263)
tensor(1.00000e-03 *
       5.0263)
tensor(1.00000e-03 *
       5.0263)
tensor(1.00000e-03 *
       5.0263)
...

In order to calculate my loss function, I have to use the whole parameter, not just the slice.
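This is expected: wrapping the slice creates a new leaf tensor that is cut out of z's autograd graph. A minimal sketch (made-up shapes; .detach().requires_grad_(True) is the modern equivalent of the Variable wrap) shows that a loss computed from the whole z never sends gradients to the wrapped copy:

import torch

z = torch.randn(4, 10, requires_grad=True)
z_slice = z[:, 3:7].detach().requires_grad_(True)  # roughly what Variable(...) does

loss = z.pow(2).sum()      # stand-in loss computed from the whole z
loss.backward()
print(z.grad.abs().sum())  # non-zero: gradients reached z
print(z_slice.grad)        # None: the copy gets nothing, so Adam has nothing to step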

I solved the problem by creating a new Variable, as saan77 said, and re-concatenating the original z with the sliced z in the training loop.
Thank you!
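For later readers, the working pattern looks roughly like this (a sketch with made-up shapes and a stand-in loss; only z_mid is optimized while the rest of z stays fixed):

import torch

start, end = 3, 7
z_init = torch.randn(4, 10)

z_left  = z_init[:, :start]                                  # frozen part
z_mid   = z_init[:, start:end].clone().requires_grad_(True)  # trainable leaf
z_right = z_init[:, end:]                                    # frozen part

optimizer = torch.optim.Adam([z_mid], lr=0.01)

for step in range(100):
    # rebuild the full z every iteration so the loss sees the whole tensor;
    # gradients flow back through torch.cat into z_mid only
    z = torch.cat([z_left, z_mid, z_right], dim=1)
    loss = z.pow(2).sum()  # stand-in for the real loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because z_mid is a genuine leaf tensor, Adam accepts it, and the torch.cat inside the loop keeps the loss defined on the full-shaped z while routing gradients only into the trainable slice.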