Workaround for "ValueError: Cannot assign non-leaf Tensor to parameter 'weight'"

This is related to the thread Discussion of practical speedup for pruning.

I am trying to copy the weights of one model into another, like so:

    mod = model.msd.msd_block                # block in the new (speedup) model
    pruned_mod = pruned_model.msd.msd_block  # block in the pruned model
    it = 0  # separate counter because some layers can be removed entirely; not important for the question
    for l in req_layers:
        pruned_weight_name = "weight" + str(l)
        weight_name = "weight" + str(it)
        pruned_weights = getattr(pruned_mod, pruned_weight_name)
        weights = getattr(mod, weight_name)
        it2 = 0
        for c in req_channels[it]:
            # copy each surviving input channel into its new position
            weights[0, it2] = pruned_weights[0, c]
            it2 += 1
        setattr(mod, weight_name, weights)  # <-- raises the ValueError below
        it += 1

This throws the error:

    ValueError: Cannot assign non-leaf Tensor to parameter 'weight0'. Model parameters must be created explicitly. To express 'weight0' as a function of another Tensor, compute the value in the forward() method.

I checked that the tensor weights holds the values I want, but without setattr(...) the weights in the model did not appear to update. Can anyone help me with a workaround?
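
For context, here is a minimal sketch that seems to reproduce the same error (dummy shapes and a hypothetical Block class, not the actual MS-D block; the assumption is that the in-place write pulls the parameter into the autograd graph):

    import torch
    import torch.nn as nn

    class Block(nn.Module):
        def __init__(self):
            super().__init__()
            # frozen parameter, so the in-place write below is allowed
            self.weight0 = nn.Parameter(torch.randn(1, 4, 3, 3), requires_grad=False)

    mod = Block()
    source = torch.randn(1, 4, 3, 3, requires_grad=True)

    w = getattr(mod, "weight0")
    w[0, 0] = source[0, 1]      # in-place write records a grad_fn on w
    setattr(mod, "weight0", w)  # ValueError: Cannot assign non-leaf Tensor ...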

Background
For pruning speedup purposes, I have created a new class with the fully pruned layers removed. In the remaining layers, the weight tensors are smaller than in the original model. The new class keeps track of which non-pruned channels serve as inputs. The forward method etc. already works and there is a nice speed-up! The only thing left is to set the kernel weights to the original weights (which should be simple), but this is what is holding me back. A rough sketch of the idea follows.
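
As an illustration of what the new class stores (hypothetical names and shapes, not the real implementation), each smaller kernel is just the original kernel restricted to the surviving input channels:

    import torch

    # original kernel: (out_channels, in_channels, k, k); out_channels = 1 here
    # to mirror the weights[0, it2] indexing in the snippet above
    original_kernel = torch.randn(1, 8, 3, 3)
    kept_channels = [0, 2, 5]                           # non-pruned input channels
    smaller_kernel = original_kernel[:, kept_channels]  # shape (1, 3, 3, 3)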

Solved
It turns out my check function was off by one: the weights were being updated all along. When you grab the weights with getattr(), you get a reference to the parameter itself, so updating it in place with weights[0, it2] = ... changes the model weights directly and no setattr(...) is needed. As for why setattr(...) throws the error: the in-place indexed assignment apparently records a grad_fn on the tensor, so it is no longer a leaf, and nn.Module refuses to register a non-leaf tensor as a parameter. A sketch of the working version is below.
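
For anyone hitting the same thing, a minimal sketch of the version that works (same names as in the question; torch.no_grad() keeps the in-place writes out of the autograd graph):

    import torch

    with torch.no_grad():
        for it, l in enumerate(req_layers):
            pruned_weights = getattr(pruned_mod, "weight" + str(l))
            weights = getattr(mod, "weight" + str(it))
            # getattr returns a reference to the parameter, so these in-place
            # writes update the model directly; no setattr(...) needed
            for it2, c in enumerate(req_channels[it]):
                weights[0, it2] = pruned_weights[0, c]

Alternatively, if you do want to use setattr(...), wrapping the new tensor first, e.g. setattr(mod, weight_name, torch.nn.Parameter(new_weights.detach().clone())), makes it a leaf again and the assignment goes through.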