requires_grad = False on Specific Filter Weights

Hello!

I am working on pruning filters of a fully convolutional neural network, and I'm starting off by zeroing out the weights of the filters I deem least important in every layer. I'd like to achieve something similar to setting a requires_grad = False flag for specific filters within a layer. I know that setting requires_grad = False on a layer's parameters freezes all of that layer's weights so they aren't updated during backprop, but I'd like that to apply to only some of the weights in the layer, not all of them (as far as I can tell, requires_grad is a per-tensor flag, so it can't target individual filters directly). Does anyone have suggestions on how to do this?
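
For context, here's roughly what I'm doing so far, plus one workaround I've been experimenting with: zeroing the chosen filters in place, then masking their gradients with a hook so they stay at zero. The layer, shapes, and filter indices below are just placeholders, and I'm not sure this is the cleanest approach (e.g. I suspect momentum or weight decay in the optimizer could still nudge the zeroed weights).

```python
import torch
import torch.nn as nn

# Toy conv layer standing in for one layer of my network;
# the indices are placeholders for the least-important filters.
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
pruned = [1, 4, 6]  # hypothetical filter indices

# Zero out the selected filters (whole output channels).
with torch.no_grad():
    conv.weight[pruned] = 0.0  # weight shape: (out, in, kH, kW)
    if conv.bias is not None:
        conv.bias[pruned] = 0.0

# requires_grad is per-tensor, so instead I mask the gradient:
# zeros at the pruned filters, ones everywhere else.
w_mask = torch.ones_like(conv.weight)
w_mask[pruned] = 0.0
conv.weight.register_hook(lambda grad: grad * w_mask)

b_mask = torch.ones_like(conv.bias)
b_mask[pruned] = 0.0
conv.bias.register_hook(lambda grad: grad * b_mask)

# Quick check: after a backward pass, the pruned filters' grads are zero.
x = torch.randn(1, 3, 16, 16)
conv(x).sum().backward()
print(conv.weight.grad[pruned].abs().sum())  # tensor(0.)
```

Is there a better-supported way to achieve this?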

Thanks so much!!

Giulio