Update grad_fn of net.parameters()

Say I have a list of parameters called p_new, and each parameter has a grad_fn that I care about.

I would like to update net.parameters() like this:

for param, pi in zip(net.parameters(), p_new):
    param = pi  # I don't want to update only .data!

Of course this doesn’t work. Any alternatives?

Hi,

You can copy into a Tensor in a differentiable way with param.copy_(pi).
The problem is that Parameters are leaves (they never have history), so you won't be allowed to do this in-place on them.
If you don't want them to be Parameters anymore and want them to have history, you can simply delete the existing Parameter with del net.foo and set a plain Tensor in its place with net.foo = your_tensor.
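For example, with a toy nn.Linear (the module and shapes here are just for illustration), the whole move looks like this:

import torch
import torch.nn as nn

net = nn.Linear(3, 3)

# A replacement tensor that has history (a grad_fn):
new_w = torch.randn(3, 3, requires_grad=True) * 2.0

# net.weight.copy_(new_w) would raise an error here: in-place
# operations are not allowed on a leaf Parameter that requires grad.

del net.weight      # remove the Parameter
net.weight = new_w  # replace it with a plain Tensor that keeps its history

out = net(torch.randn(1, 3))  # the forward pass works, and backward
                              # flows through new_w's grad_fn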

Thanks! But how do I do that for all parameters?
Like this?

for param, pi in zip(net.parameters(), p_new):
    del param
    param = pi

I guess not, since del param only deletes the local loop variable, not the attribute on the module.

Also, after the update I want to do a forward pass. Will I be able to do that?

When you do net.parameters(), you only get the Parameter, not the parent Module.
You can use named_parameters() to get both the Parameter and the name you need to access it.
You can do something that looks like the following:

def del_obj(mod, name):
    names = name.split(".")
    for n in names[:-1]:
        mod = mod[n]
    del mod[names[-1]]

def set_obj(mod, name, new_val):
    names = name.split(".")
    for n in names[:-1]:
        mod = mod[n]
    mod[names[-1]] = new_val

for (name, _), pi in zip(net.named_parameters(), p_new):
    del_obj(net, name)
    set_obj(net, name, pi)

I get the error: model object is not subscriptable.

FYI the names of the parameters are:

fc.0.weight
fc.0.bias
fc.1.weight
fc.1.bias

And even if I try explicitly del fc.0.weight I get: TypeError: ‘Linear’ object does not support item deletion

You will need to get the Linear module first with lin = fc[0]; then you can delete the weight with del lin.weight.
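Putting the whole thread together: the subscripting in the helpers is what breaks, since plain Modules only support attribute access (only containers like Sequential support indexing). Using getattr/delattr/setattr works for both. A sketch; the Model class, the shapes, and the p_new stand-in below are made up to match the fc.0/fc.1 names above:

import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(3, 3), nn.Linear(3, 1))

    def forward(self, x):
        return self.fc(x)

def del_obj(mod, name):
    names = name.split(".")
    for n in names[:-1]:
        mod = getattr(mod, n)  # getattr also resolves Sequential indices like "0"
    delattr(mod, names[-1])

def set_obj(mod, name, new_val):
    names = name.split(".")
    for n in names[:-1]:
        mod = getattr(mod, n)
    setattr(mod, names[-1], new_val)

net = Model()

# Stand-in for p_new: tensors with history built from the old parameters
p_new = [p.detach().clone().requires_grad_() * 1.0 for p in net.parameters()]

# Snapshot the names first: deleting while iterating named_parameters()
# would mutate the dict being iterated over.
names = [name for name, _ in net.named_parameters()]
for name, pi in zip(names, p_new):
    del_obj(net, name)
    set_obj(net, name, pi)

out = net(torch.randn(2, 3))  # forward pass still works; out.grad_fn
                              # chains back through each pi's history

Note that after the swap net.parameters() is empty (the weights are plain Tensors now), so keep your own references to the leaf tensors you want to optimize.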