I’m trying to set one element of a weight tensor to 1 and then hold it there until the end of training (prevent it from updating in later epochs). I know I can set requires_grad, but can I do that for just one element?
Hi Afsaneh!
The short answer is to set the specific element in question back to 1 after calling opt.step().
As you’ve recognized, requires_grad applies to an entire tensor, not separately to individual elements. The same is true of an Optimizer: it operates on entire tensors, not on individual elements. There are a number of approaches to “freezing” individual elements of a tensor that is otherwise being optimized, but simply resetting the desired values after running opt.step() is probably the most straightforward.
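Here is a minimal sketch of that reset-after-step approach. The Linear layer, plain SGD, the dummy loss, and the choice of weight[0, 0] as the pinned element are all placeholder assumptions for illustration:

```python
import torch

torch.manual_seed(0)

layer = torch.nn.Linear(4, 3)   # hypothetical layer whose weight we train
opt = torch.optim.SGD(layer.parameters(), lr=0.1)

# Pin weight[0, 0] to 1.0 before training starts.
with torch.no_grad():
    layer.weight[0, 0] = 1.0

for epoch in range(5):
    opt.zero_grad()
    loss = layer(torch.randn(8, 4)).pow(2).mean()  # dummy loss
    loss.backward()
    opt.step()
    # The step updated every element of weight, so restore the frozen one.
    with torch.no_grad():
        layer.weight[0, 0] = 1.0

print(layer.weight[0, 0])  # tensor(1.)
```

Note that with optimizers that keep per-element state (momentum, Adam, etc.), that state still accumulates for the pinned element, but resetting the value after each step keeps it fixed regardless.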
Best.
K. Frank