When I JIT-trace a Python model, it throws this error:
ValueError: TracedModules don't support parameter sharing between modules
But I do not understand the difference between parameter sharing and module reuse. Aren't they the same thing?
We define a model as a class, create some modules in __init__, then call them one by one in forward(). Isn't that reuse? If so, what does the parameter-sharing case look like?
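To make my question concrete, here is a minimal sketch of the two cases as I understand them (toy nn.Linear modules; the names ReuseModel and SharedModel are just illustrative):

```python
import torch
import torch.nn as nn

# Module REUSE: one module object, called more than once in forward().
# Its parameters exist in exactly one place in the module tree.
class ReuseModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)

    def forward(self, x):
        return self.fc(self.fc(x))  # same module, two calls

# Parameter SHARING: two different module objects hold the SAME
# Parameter tensor, so it appears twice in the module tree.
class SharedModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 8)
        self.fc2 = nn.Linear(8, 8)
        self.fc2.weight = self.fc1.weight  # tie the weights

    def forward(self, x):
        return self.fc2(self.fc1(x))

reuse = ReuseModel()
shared = SharedModel()
print(shared.fc1.weight is shared.fc2.weight)  # True: genuinely shared

# Plain reuse traces without complaint.
traced = torch.jit.trace(reuse, torch.randn(2, 8))
```

If my understanding is right, only the SharedModel pattern would hit the restriction the error message describes.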
In my model, how do I find the layers that share parameters? Everywhere I looked, it seemed like module reuse rather than parameter sharing.
From the architecture below, is it possible to tell whether parameters are shared or not?
unet8: TUM(
  (layers): Sequential(
    (0): BasicConv(
      (conv): Conv2d(384, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (1): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (2): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (3): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (4): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
  )
  (toplayer): Sequential(
    (0): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
  )
  (latlayer): Sequential(
    (0): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (1): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (2): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (3): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (4): BasicConv(
      (conv): Conv2d(384, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
  )
  (smooth): Sequential(
    (0): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (1): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (2): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (3): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
    (4): BasicConv(
      (conv): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn): BatchNorm2d(256, eps=1e-05, momentum=0.01, affine=True, track_running_stats=True)
      (relu): ReLU(inplace)
    )
  )
)
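Since the printed architecture only shows module structure, not which tensors are identical, here is a sketch of how one might scan for shared parameters programmatically (find_shared_parameters is a hypothetical helper, not a PyTorch API; it only relies on the standard named_modules/named_parameters calls):

```python
import collections

import torch.nn as nn

def find_shared_parameters(model):
    """Group every parameter by tensor identity; return the groups of
    names under which the same tensor is registered more than once."""
    owners = collections.defaultdict(list)
    for mod_name, mod in model.named_modules():
        # recurse=False: each parameter is listed only by its owning module,
        # so a tensor that shows up twice really is registered twice.
        for p_name, p in mod.named_parameters(recurse=False):
            full_name = f"{mod_name}.{p_name}" if mod_name else p_name
            owners[id(p)].append(full_name)
    return [names for names in owners.values() if len(names) > 1]

# Example: deliberately tie two conv weights, then scan for duplicates.
model = nn.Sequential(
    nn.Conv2d(16, 16, 3, padding=1, bias=False),
    nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1, bias=False),
)
model[2].weight = model[0].weight  # intentional sharing
print(find_shared_parameters(model))  # [['0.weight', '2.weight']]
```

Running something like this over my model should list exactly the layers that trip the tracing restriction; an empty result would mean every parameter is owned by a single module.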