hdkgr
(Stefan)
October 11, 2018, 1:02pm
#1
I noticed that deepcopying a module causes its parameters() to be plain torch.Tensors rather than nn.Parameters.
import torch.nn
import copy
l = torch.nn.Linear(3,1)
c = copy.deepcopy(l)
print([type(p) for p in l.parameters()])
print([type(p) for p in c.parameters()])
Output:
[<class 'torch.nn.parameter.Parameter'>, <class 'torch.nn.parameter.Parameter'>]
[<class 'torch.Tensor'>, <class 'torch.Tensor'>]
Why does this happen and can this cause any problems when working with the copy of the module later on?
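In case it helps while this is open: a possible workaround (my own sketch, not an official fix) is to avoid deepcopying the module object directly and instead build a fresh module and load a deep copy of the state dict into it. load_state_dict copies values into the existing nn.Parameters of the new module, so the parameter types are preserved.

```python
import copy
import torch.nn

l = torch.nn.Linear(3, 1)

# Workaround sketch: construct a fresh module of the same shape and copy
# the weights over via the state dict instead of deepcopying the module.
c = torch.nn.Linear(3, 1)
c.load_state_dict(copy.deepcopy(l.state_dict()))

# The copy's parameters keep their nn.Parameter type.
print([type(p).__name__ for p in c.parameters()])  # ['Parameter', 'Parameter']
```

This only works when you can reconstruct the module with the same architecture, of course; it sidesteps the deepcopy path rather than fixing it.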
ptrblck
October 13, 2018, 1:29am
#2
Thanks for reporting it!
It might be this old bug again.
smth
October 13, 2018, 1:33am
#3
@hdkgr what PyTorch version are you on? print(torch.__version__)
ptrblck
October 13, 2018, 1:36am
#4
I could reproduce this issue in 1.0.0.dev20181007 and 1.0.0a0+dfad8b6.
hdkgr
(Stefan)
October 13, 2018, 10:13am
#5
I don’t have the system at hand, but I’m quite sure it’s the 0.4.1 stable build.
smth
October 18, 2018, 7:12pm
#6
This looks like a regression in 0.4.0 / 0.4.1. We reopened the issue and an engineer is working on a fix.