In Torch, we can share parameters between two nets like this:
local r1 = nn.Recurrent(
    opt.embeddingSize,
    cnn,
    h2h,
    nn.Identity(),
    opt.sampleSeqLength)
local r2 = nn.Recurrent(
    opt.embeddingSize,
    cnn:clone('weight', 'bias', 'gradWeight', 'gradBias'),
    h2h:clone('weight', 'bias', 'gradWeight', 'gradBias'),
    nn.Identity(),
    opt.sampleSeqLength)
So, how can I share weights between two nets in PyTorch?
shufanwu (Wushufan) #2
I think you can refer to this topic:
smth #3
You can simply use the same net for different outputs: in Torch, each net is tied to a particular output, while in PyTorch it is not.
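For example (a minimal sketch; the sizes and the name `shared` are made up for illustration), applying one nn.Linear to two different inputs means both applications use, and update, the same parameters:

import torch
import torch.nn as nn

shared = nn.Linear(10, 10)  # one module, one set of parameters

x1 = torch.randn(4, 10)
x2 = torch.randn(4, 10)

# Calling the same module twice shares its weights; gradients from
# both paths accumulate into shared.weight.grad.
y = shared(x1) + shared(x2)
y.sum().backward()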
Baichuan (Baichuan) #5
Hi, can you explain it? Thanks.
Baichuan (Baichuan) #6
We can implement weight sharing among the innermost layers by simply reusing the same Module multiple times when defining the forward pass; see https://pytorch.org/tutorials/beginner/examples_nn/dynamic_net.html#pytorch-control-flow-weight-sharing
Yes, you're right; this is how PyTorch creates a model with shared weights.
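In the spirit of that tutorial (a sketch with made-up layer sizes; the middle layer is reused a random number of times per forward pass, so every repetition shares one set of weights):

import random
import torch
import torch.nn as nn

class DynamicNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.input_linear = nn.Linear(8, 16)
        self.middle_linear = nn.Linear(16, 16)  # reused below -> shared weights
        self.output_linear = nn.Linear(16, 1)

    def forward(self, x):
        h = torch.relu(self.input_linear(x))
        # Reusing the same Module in a loop shares its parameters
        # across every application in the unrolled graph.
        for _ in range(random.randint(1, 4)):
            h = torch.relu(self.middle_linear(h))
        return self.output_linear(h)

net = DynamicNet()
out = net(torch.randn(4, 8))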
Baichuan (Baichuan) #8
Do you use Torch instead of PyTorch?
No, I use PyTorch instead of Torch.