`ValueError: optimizer got an empty parameter list`

You could use either `nn.ModuleList` or `nn.Sequential` to store the layers (and `nn.ModuleDict` for the dict case). Plain Python lists and dicts don't register their contents as submodules, so the layers' parameters never show up in `model.parameters()` and the optimizer receives an empty list.
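For illustration, a minimal sketch of the failure and the fix (the class names `Broken` and `Fixed` are made up for this example):

```python
import torch
import torch.nn as nn

class Broken(nn.Module):
    def __init__(self):
        super().__init__()
        # Plain Python list: the Linear layers are NOT registered as
        # submodules, so .parameters() yields nothing.
        self.layers = [nn.Linear(4, 4) for _ in range(3)]

class Fixed(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.ModuleList registers each layer, so the optimizer sees them.
        self.layers = nn.ModuleList(nn.Linear(4, 4) for _ in range(3))

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

print(len(list(Broken().parameters())))  # 0 -> triggers the ValueError
print(len(list(Fixed().parameters())))   # 6 (weight + bias per layer)
opt = torch.optim.SGD(Fixed().parameters(), lr=0.1)  # now succeeds
```

The same registration rule applies to single tensors: wrap them in `nn.Parameter` (or use `nn.ParameterList`) rather than storing raw tensors in a list.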