How to drop specific filters in some CNN layers

I have read the 2019 CVPR paper *RePr: Improved Training of Convolutional Filters* and want to implement its method. First, we train a CNN for several iterations. Then we use a metric to rank all filters and drop the 30% least important ones, continuing to train for a few more iterations, much like pruning. After that, we re-initialize the dropped filters and continue training the full network. In short, the training process is modified to cyclically remove redundant filters, retrain the network, re-initialize the removed filters, and repeat.
In my opinion, dropping a filter does not really remove it; it just makes that filter's weights inactive, so they neither contribute to the output nor get updated. I tried to achieve this by resetting the weights of the dropped filters to 0. For example, to drop the 23rd filter in the first layer:
params = list(model.named_parameters())
params[0][1][22] = 0
But loss.backward() then raises an error: RuntimeError: leaf variable has been moved into the graph interior.
Does anyone know an official method, or any other way, to drop specific filters in certain layers? Thanks very much!

One option can be something like this:

for key, value in dict(model.named_parameters()).items():
    if 'the_name_of_the_conv_you_want_to_zero_out' in key:
        value.data[22] = 0  # zero the whole 23rd filter (index 22), not a single element

or, another option:
model.the_name_of_the_conv_you_want_to_zero_out.weight.data[22] = 0
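One caveat: writing to `.data` only clears the current weights; the optimizer will move them again on the next step, because the gradient for that filter is still nonzero. To also freeze a dropped filter, you can zero its gradient with a tensor hook. A minimal sketch (the model, layer, and filter index here are placeholders for illustration):

```python
import torch
import torch.nn as nn

# Toy model for illustration; replace with your own network.
model = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(32, 16, 3, padding=1))

conv = model[0]
filter_idx = 22  # drop the 23rd filter of the first conv

# 1) Zero the filter's current weights without touching the autograd graph.
with torch.no_grad():
    conv.weight[filter_idx] = 0.0
    if conv.bias is not None:
        conv.bias[filter_idx] = 0.0

# 2) Zero its gradient after every backward pass so the optimizer
#    never updates it while it is "dropped".
def mask_grad(grad):
    grad = grad.clone()
    grad[filter_idx] = 0.0
    return grad

hook = conv.weight.register_hook(mask_grad)

# One training-style step to show the effect.
x = torch.randn(4, 3, 8, 8)
loss = model(x).sum()
loss.backward()
# conv.weight.grad[22] is now all zeros; the filter stays frozen.

hook.remove()  # call this later, when you re-initialize the filter
```

Note that this sketch only masks the weight gradient; if you also want the bias frozen, register a similar hook on `conv.bias`. Optimizers with momentum may still nudge the weights from previously accumulated state, so it is safest to drop filters right after an optimizer step or reset the optimizer state for that parameter.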


However, if you want to actually prune the network (remove the filters and shrink the layers), you have to choose another way.
Here is a post about how to change a CNN's structure at runtime: https://find1dream.github.io/en/How_to_change_CNN_structure_in_running_time_with_pytorch
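Putting the pieces together, the drop/retrain/re-initialize cycle from the question can be sketched like this. Note this is an assumption-laden sketch, not the paper's implementation: RePr ranks filters by inter-filter orthogonality, and here a simple per-filter L1 norm is used as a stand-in metric:

```python
import torch
import torch.nn as nn

def rank_filters(conv):
    """Placeholder importance metric: per-filter L1 norm, ascending.
    The RePr paper itself ranks filters by inter-filter orthogonality."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3)).argsort()

def drop_filters(conv, drop_ratio=0.3):
    """Zero the least important filters and freeze them via a grad hook."""
    order = rank_filters(conv)
    n_drop = int(drop_ratio * conv.out_channels)
    dropped = order[:n_drop]
    with torch.no_grad():
        conv.weight[dropped] = 0.0

    def mask_grad(grad):
        grad = grad.clone()
        grad[dropped] = 0.0
        return grad

    handle = conv.weight.register_hook(mask_grad)
    return dropped, handle

def reinit_filters(conv, dropped, handle):
    """Re-initialize the dropped filters and let their gradients flow again."""
    handle.remove()
    with torch.no_grad():
        new = torch.empty_like(conv.weight[dropped])
        nn.init.kaiming_normal_(new)
        conv.weight[dropped] = new

# Cycle (per the question): train S1 iters -> drop_filters -> train S2 iters
# with the sub-network -> reinit_filters -> repeat.
conv = nn.Conv2d(3, 32, 3)
dropped, handle = drop_filters(conv)      # 30% of 32 filters -> 9 dropped
# ... train for a few iterations here ...
reinit_filters(conv, dropped, handle)
```

The paper also re-initializes the new filters to be orthogonal to the surviving ones, which this sketch skips in favor of plain Kaiming initialization.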