How to freeze alternate filters/channels within a model?

I have a layer named “conv2” and want to freeze the weights of all filters that are at even index positions.

I am trying this:

for i, filter_val in enumerate(model_eval.conv2.weight):
    if i % 2 == 0:  # even-indexed filters should be frozen
        filter_val.requires_grad = False

I get this error:

RuntimeError: you can only change requires_grad flags of leaf variables. If you want to use a computed variable in a subgraph that doesn't require differentiation use var_no_grad = var.detach().

Since each filter is a slice of the layer’s weight tensor, and requires_grad can only be set to True or False on that whole tensor, I cannot set it for individual filters that are a subset of the layer.

Is there any workaround for this?

It is not possible to assign requires_grad at the sub-tensor level. A way around this is to take a view and .detach() the filters you want to freeze, and use the resulting tensor instead of filter_val. But then you need to make sure you don’t use conv2.weight directly later during the forward pass.
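
A minimal sketch of that idea (the module name PartiallyFrozenConv and the sizes below are made up for illustration, assuming a standard 2D conv): the even-indexed filters are replaced by detached copies inside forward, and F.conv2d is called with that tensor instead of conv2.weight.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PartiallyFrozenConv(nn.Module):
    # Conv layer whose even-indexed filters receive no gradient.
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.conv2 = nn.Conv2d(in_channels, out_channels, kernel_size)
        # Boolean mask over output filters: True means "freeze this filter".
        frozen = torch.zeros(out_channels, dtype=torch.bool)
        frozen[::2] = True
        self.register_buffer("frozen", frozen)

    def forward(self, x):
        w = self.conv2.weight
        # Swap in detached copies for the frozen filters so no gradient
        # flows into them; the other filters stay in the autograd graph.
        weight = torch.where(self.frozen.view(-1, 1, 1, 1), w.detach(), w)
        return F.conv2d(x, weight, self.conv2.bias,
                        stride=self.conv2.stride, padding=self.conv2.padding)

# Quick check: even-indexed filters end up with zero gradient.
layer = PartiallyFrozenConv(3, 8, kernel_size=3)
layer(torch.randn(1, 3, 16, 16)).sum().backward()
print(layer.conv2.weight.grad[::2].abs().max())   # tensor(0.)
print(layer.conv2.weight.grad[1::2].abs().max())  # non-zero

Note that with this approach the frozen filters still get zero gradients rather than no gradients, so an optimizer that applies weight decay would still modify them slightly.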