Cannot update ModuleDict with another ModuleDict

Hi. I’m new to this forum, so I hope I’m making myself clear enough :sweat_smile:. I’m working on a project using PyTorch 1.4.0 and I’m getting an error while trying to merge two ModuleDict instances. According to the documentation of torch.nn.ModuleDict, one ModuleDict can be merged into another using the update() method. I tried this with the following code:

import torch
from collections import OrderedDict

model1 = torch.nn.ModuleDict(OrderedDict([
          ('conv1', torch.nn.Conv2d(1, 20, 5)),
          ('relu1', torch.nn.ReLU())]))

model2 = torch.nn.ModuleDict(OrderedDict([
          ('conv1', torch.nn.Conv2d(1, 20, 5)),
          ('relu1', torch.nn.ReLU())]))

model1.update(model2)

but this gives the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-157-967ebf8b86d0> in <module>
     10 print(type(model2))
     11 
---> 12 model1.update(model2)

c:\python36\envs\modern_pointcn\lib\site-packages\torch\nn\modules\container.py in update(self, modules)
    348                     raise ValueError("ModuleDict update sequence element "
    349                                      "#" + str(j) + " has length " + str(len(m)) +
--> 350                                      "; 2 is required")
    351                 self[m[0]] = m[1]
    352 

ValueError: ModuleDict update sequence element #0 has length 5; 2 is required

By looking into the source code, it seems a torch.nn.ModuleDict is not recognized as a Mapping. Is this a bug? If not, how can I merge two ModuleDict instances?

This might be a bug: the ModuleDict is apparently not treated as a Mapping, so update() falls into the sequence branch and ends up checking the length of each key string (e.g. 'conv1', which has length 5) instead of a (key, module) pair.
Could you please create an issue on GitHub?
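In the meantime, a possible workaround (just a sketch, assuming update() accepts a plain dict through its Mapping branch and that ModuleDict.items() yields (name, module) pairs) could be to convert the second ModuleDict before merging, or to copy its entries over one by one:

# workaround: hand update() a plain dict, which is recognized as a Mapping
model1.update(dict(model2.items()))

# equivalent manual copy, entry by entry
for name, module in model2.items():
    model1[name] = module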

Thank you for the feedback. Yes, I’ll open an issue on GitHub.