Hi,
I am porting a Torch model to a PyTorch model, but I'm having difficulty sharing the parameters and gradParameters between two layers.
Here is the Torch script:
Thanks for any help.
You can reuse the same layer again and again; you don't need to clone a separate layer for each output. For example, this works fine:
import torch.nn as nn

m = nn.Conv2d(...)           # a single layer instance
output1 = m(input1)          # both calls use the same parameters
output2 = m(input2)
(output1 + output2).sum().backward()  # gradients from both uses accumulate
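To make the pattern above concrete, here is a minimal runnable sketch; the layer sizes and input shapes are illustrative assumptions, not from the original post. Calling the same `nn.Conv2d` instance on two inputs shares its weights between both paths, and `.backward()` accumulates the gradient from both uses into the same `.grad` tensors, which is the behaviour Torch's parameter/gradParameter sharing provided.

```python
import torch
import torch.nn as nn

# One module instance = one set of parameters, shared across all calls.
# The sizes here (3 -> 8 channels, 3x3 kernel, 16x16 inputs) are made up
# for illustration.
m = nn.Conv2d(3, 8, kernel_size=3, padding=1)
input1 = torch.randn(1, 3, 16, 16)
input2 = torch.randn(1, 3, 16, 16)

# Reuse the same layer for both inputs; no clone() needed.
output1 = m(input1)
output2 = m(input2)
(output1 + output2).sum().backward()

# m.weight.grad now holds the summed gradient from both forward passes.
print(m.weight.grad.shape)
```

Because autograd accumulates into `.grad`, the result is exactly what `sharedClone` / `share('weight', 'gradWeight')` gave you in Torch: one parameter tensor, one gradient buffer, fed by every place the module is used.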