Hi there,
I have a model with a variable number of custom layers, so I use torch.nn.ModuleList. Now I would like to set per-parameter options in my optimizer, but the parameters are not grouped under a name (as they would be in torch.nn.ModuleDict). How can I still set per-parameter options?
class MyModel(torch.nn.Module):
    def __init__(self, channel_dimensions):
        super().__init__()
        self.layers = torch.nn.ModuleList(
            [CustomLayer(channel_dimensions[i + 1], channel_dimensions[i])
             for i in range(len(channel_dimensions) - 1)]
        )

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x
Maybe something like
model = MyModel(channel_dimensions)
optim.SGD([
    {'params': model.layers[-1].parameters()}
], lr=1e-2, momentum=0.9)
?
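For reference, here is a runnable sketch of what I mean. CustomLayer here is just a placeholder wrapping a single Linear layer (my real layer is more involved), and the learning rates are arbitrary; the point is that slicing the ModuleList gives the last layer its own param group while the rest fall back to the default lr:

```python
import torch

# Placeholder for the real CustomLayer (assumption: one Linear inside).
class CustomLayer(torch.nn.Module):
    def __init__(self, out_channels, in_channels):
        super().__init__()
        self.linear = torch.nn.Linear(in_channels, out_channels)

    def forward(self, x):
        return self.linear(x)

class MyModel(torch.nn.Module):
    def __init__(self, channel_dimensions):
        super().__init__()
        # Consecutive entries give (in, out) sizes, hence len - 1 layers.
        self.layers = torch.nn.ModuleList(
            [CustomLayer(channel_dimensions[i + 1], channel_dimensions[i])
             for i in range(len(channel_dimensions) - 1)]
        )

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

model = MyModel([4, 8, 2])

# Per-parameter options: the last layer gets its own lr,
# all earlier layers use the default lr of the optimizer.
# Slicing a ModuleList returns a ModuleList, which has .parameters().
optimizer = torch.optim.SGD(
    [
        {"params": model.layers[-1].parameters(), "lr": 1e-3},
        {"params": model.layers[:-1].parameters()},
    ],
    lr=1e-2,
    momentum=0.9,
)
```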