Hi.
I’m trying to freeze some of the weight parameters in a layer.
For example, I have a conv layer whose weight has shape n * c * k * k (n is the output channel depth, c is the input channel depth, and k is the kernel size).
I set some of the kernels to zero, so the conv layer effectively becomes n’ * c * k * k.
I want those zeroed weights to stay frozen, i.e. I only want to train the remaining n’ * c * k * k weights.
“requires_grad” does not work in this case, since it applies to the whole weight tensor, not to individual kernels.
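Roughly, what I’m after is something like the gradient-masking sketch below (the layer shapes and the frozen channel indices are just an illustration, not my real model). The idea is to zero the chosen kernels and then cancel their gradients with a hook so the optimizer never updates them:

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3)  # weight shape: 8 * 3 * 3 * 3

# Hypothetical choice: freeze output channels 0 and 5.
frozen = [0, 5]
mask = torch.ones_like(conv.weight)
mask[frozen] = 0.0

# Zero out the kernels themselves.
with torch.no_grad():
    conv.weight[frozen] = 0.0

# Multiply incoming gradients by the mask, so the frozen kernels
# receive zero gradient on every backward pass.
conv.weight.register_hook(lambda grad: grad * mask)

opt = torch.optim.SGD(conv.parameters(), lr=0.1)
x = torch.randn(4, 3, 16, 16)
loss = conv(x).pow(2).mean()
loss.backward()
opt.step()

# The frozen kernels are still zero after the update step.
print(conv.weight[frozen].abs().sum().item())  # 0.0
```

Note that this only works cleanly with plain SGD; an optimizer with momentum or weight decay can still move the masked weights even when their gradient is zero, in which case the weights would also need to be re-zeroed after each step.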