I want to set a portion of a tensor (such as `x[1:3, 5:10, :]`) to have `requires_grad=False`. Is this possible?
No, that’s not directly possible, since the `.requires_grad` attribute applies to the entire tensor. You could either zero out the gradients of the frozen part before the optimizer step, restore the frozen values after the rest of the tensor was updated, or recreate `x` from smaller tensors with different `.requires_grad` settings via `torch.cat` or `torch.stack`.
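A rough sketch of these workarounds could look like this (the tensor shape, the frozen slice, and the plain `SGD` optimizer are just placeholders for illustration):

```python
import torch

# Illustrative shape, slice, and optimizer; the real ones depend on your use case.
x = torch.randn(4, 12, 8, requires_grad=True)
optimizer = torch.optim.SGD([x], lr=0.1)

loss = (x ** 2).sum()
loss.backward()

# Option 1: zero out the gradients of the slice you want to freeze,
# so the optimizer step leaves those entries unchanged.
x.grad[1:3, 5:10, :] = 0.0

# Option 2: alternatively, save the frozen values and write them back
# after the step (in-place writes to a leaf tensor need no_grad).
frozen = x[1:3, 5:10, :].detach().clone()
optimizer.step()
with torch.no_grad():
    x[1:3, 5:10, :] = frozen
optimizer.zero_grad()

# Option 3: build the tensor from pieces with different requires_grad
# settings. Gradients then flow only into the trainable piece, which is
# what you would pass to the optimizer (the concatenated tensor is a non-leaf).
frozen_part = torch.randn(2, 5, 8)                        # requires_grad=False
trainable_part = torch.randn(2, 5, 8, requires_grad=True)
x2 = torch.cat([frozen_part, trainable_part], dim=0)
```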
That’s what I thought too. I wish it were possible, though. For example, if the `requires_grad` flag were instead allowed to contain indices of the tensor, then `optimizer.step` could apply the gradient update only to those indices. This wouldn’t hurt performance, since the optimizer could fall back on the existing update path whenever `requires_grad` is a plain boolean.
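In the meantime, something close to per-index freezing can be emulated with a gradient hook that masks the frozen indices before the optimizer sees them. The shapes, the frozen slice, and the optimizer below are illustrative assumptions, and this only works cleanly when the optimizer has no weight decay or accumulated momentum for those entries:

```python
import torch

# Emulate "per-index requires_grad" by masking gradients in a hook.
x = torch.randn(4, 12, 8, requires_grad=True)

mask = torch.ones_like(x)
mask[1:3, 5:10, :] = 0.0  # entries we would like to freeze

# Every gradient flowing into x is multiplied by the mask, so the
# frozen entries always receive a zero gradient.
x.register_hook(lambda grad: grad * mask)

optimizer = torch.optim.SGD([x], lr=0.1)
loss = (x ** 2).sum()
loss.backward()
optimizer.step()  # masked entries stay unchanged (no weight decay/momentum)
```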