Freezing some layers of weights


I’m trying to freeze some of a layer’s weight parameters.

For example, I have a CONV layer of shape n * c * k * k (n is the output channel count, c is the input channel count, and k is the kernel size).

I set some kernels to zero, so the effective CONV layer shape becomes n’ * c * k * k.

I want to keep those specific zeroed weights frozen, i.e. train only the remaining n’ * c * k * k weights.

“requires_grad” does not work in this case.

You cannot set the requires_grad attribute on part of a parameter; instead you have to zero out the gradients of those specific values after each .backward() call to keep them static.
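A minimal sketch of this idea, using a gradient hook so the masking happens automatically on every backward pass (the layer shape and the choice of which kernels to freeze are illustrative assumptions, not taken from your setup):

```python
import torch
import torch.nn as nn

# Illustrative conv layer: n=4 output channels, c=3 input channels, k=3.
conv = nn.Conv2d(in_channels=3, out_channels=4, kernel_size=3)

# Boolean mask over the weight tensor: True marks weights to keep frozen.
# As an example, freeze all kernels of the first output channel.
frozen = torch.zeros_like(conv.weight, dtype=torch.bool)
frozen[0] = True

# Optionally zero the frozen weights themselves first.
with torch.no_grad():
    conv.weight.masked_fill_(frozen, 0.0)

# The hook runs on every backward() and zeroes the masked gradients,
# so an optimizer step never updates those weights.
conv.weight.register_hook(lambda grad: grad.masked_fill(frozen, 0.0))

x = torch.randn(2, 3, 8, 8)
conv(x).sum().backward()

print(conv.weight.grad[0].abs().sum().item())  # frozen kernels: zero gradient
```

One caveat: zeroed gradients keep the weights static under plain SGD, but optimizers with weight decay (or accumulated momentum from earlier steps) can still move them, in which case you may also want to re-zero the frozen weights after each optimizer.step().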