I noticed that if I store modules in a plain Python list inside a model, those modules' parameters are not affected when I call model.cuda(). It seems that .parameters() doesn't take the modules inside the list into account?
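A minimal sketch of the problem (the class and attribute names here are hypothetical, just for illustration) — submodules stored in a plain Python list are invisible to .parameters():

```python
import torch.nn as nn

class PlainListModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Submodules hidden inside a plain Python list are NOT
        # registered with the parent module.
        self.layers = [nn.Linear(4, 4) for _ in range(3)]

model = PlainListModel()
# The list is invisible to .parameters(), so nothing is found
print(len(list(model.parameters())))  # 0
```

Because the parameters are never registered, calls like model.cuda() or an optimizer built from model.parameters() will silently skip them.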
Hmm, I still don't quite understand how .parameters() works. What kinds of members does it automatically keep track of?
This sounds like a minor bug that PyTorch should consider fixing?
.parameters() works like this: it walks all members of the class (anything added to self) and does one of three things with each member:

- If the member is a parameter (something registered with register_parameter(...) or of type nn.Parameter), it adds it to the parameters list.
- If the member is of type (or is a subclass of) nn.Module, .parameters() is called recursively.
- Otherwise, it is ignored.

In theory, a 4th option could have been added to handle plain lists, but nn.ModuleList was chosen instead.
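The fix can be sketched like this (hypothetical class and attribute names): wrapping the list in nn.ModuleList makes it a registered submodule, so .parameters() recurses into it via option 2 above:

```python
import torch.nn as nn

class ModuleListModel(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.ModuleList is itself an nn.Module, so it (and every
        # submodule inside it) is registered with the parent.
        self.layers = nn.ModuleList(nn.Linear(4, 4) for _ in range(3))

model = ModuleListModel()
# 3 Linear layers x (weight + bias) = 6 parameter tensors
print(len(list(model.parameters())))  # 6
```

With this version, model.cuda() moves all the layers' parameters, and the layers show up in model.state_dict() as well.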
Ah I see, so if I create an nn.ModuleList instead of a plain list, .parameters() can use option 2 to recursively find the parameters?
Thanks!