Hypernetworks need variables not parameters

Hi all,

I would like to implement a hypernetwork.
In a hypernetwork, the output of one network is used as the weights of another network.
Hence the weights of the second network should be variables, not parameters.

Is there any way to do this in pytorch?
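One way this is commonly done (a minimal sketch, not an official API; the class name `HyperNet` and all dimensions here are made up for illustration) is to keep trainable parameters only in the hypernetwork, and apply its outputs to the target computation via the functional API, e.g. `torch.nn.functional.linear`. The generated weights are then ordinary tensors in the autograd graph rather than `nn.Parameter`s:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperNet(nn.Module):
    """Hypothetical sketch: a small hypernetwork generates the weights
    of a target linear layer from an embedding z."""

    def __init__(self, z_dim, in_features, out_features):
        super().__init__()
        self.in_features = in_features
        self.out_features = out_features
        # Only these layers hold trainable nn.Parameters.
        self.weight_gen = nn.Linear(z_dim, out_features * in_features)
        self.bias_gen = nn.Linear(z_dim, out_features)

    def forward(self, z, x):
        # Generated weights: plain tensors, so gradients flow back
        # into the hypernetwork ("variables, not parameters").
        w = self.weight_gen(z).view(self.out_features, self.in_features)
        b = self.bias_gen(z)
        return F.linear(x, w, b)

hyper = HyperNet(z_dim=4, in_features=8, out_features=3)
z = torch.randn(4)
x = torch.randn(2, 8)
y = hyper(z, x)        # shape (2, 3)
y.sum().backward()     # gradients reach hyper.weight_gen / hyper.bias_gen
```

Because the target layer has no parameters of its own, the optimizer only sees the hypernetwork's parameters, and backpropagation through the generated weights trains the hypernetwork end to end.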

Thanks


I think the problem is the requires_grad setting.
You can override it after creation, e.g.

for p in lstm.parameters():
    if p.requires_grad:
        p.requires_grad_(False)