Hi everyone,

I have a question about optimizer.step(). Say I have a function g that depends on a variable t, and I want to parametrize g with a neural network. To simplify, suppose there are only two parameters, x and y. Is it possible to use optimizer.step() so that, after calling loss.backward(), only the parameter y is updated while x stays unchanged?
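
In case it helps to make the question concrete, here is a minimal sketch of one common way to achieve this with PyTorch's standard API: register only y with the optimizer, so step() never touches x (the scalar parameters and the form g(t) = x*t + y are just illustrative assumptions):

```python
import torch

# Toy setup: g(t) = x * t + y, with two scalar parameters x and y.
x = torch.tensor(1.0, requires_grad=True)
y = torch.tensor(2.0, requires_grad=True)

# Pass only y to the optimizer; step() will leave x untouched,
# even though x still receives a gradient from backward().
optimizer = torch.optim.SGD([y], lr=0.1)

t = torch.tensor(3.0)
loss = (x * t + y) ** 2  # loss = (1*3 + 2)^2 = 25

optimizer.zero_grad()
loss.backward()
optimizer.step()

print(x.item())  # x unchanged: 1.0
print(y.item())  # y updated: 2.0 - 0.1 * d(loss)/dy = 2.0 - 0.1 * 10 = 1.0
```

An alternative is to set `x.requires_grad_(False)` before the forward pass, which also stops gradient computation for x entirely; the optimizer-group approach above is useful when you want x's gradient available but frozen.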