Model with a dynamic number of layers

To run a hyperparameter optimization that finds out how many CNN layers work best, my module accepts this number in its __init__ function and stores the CNN layers in a Python list.
The layers are then chained in the forward function.

Doing this, PyTorch ignores the parameters of those CNN layers: since they sit in a plain Python list instead of being attributes of the module directly, they never show up in model.parameters().
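
Roughly like this (a minimal sketch; the class name and layer shapes are made up for illustration):

```python
import torch
import torch.nn as nn

class DynamicCNN(nn.Module):
    def __init__(self, num_layers, channels=16):
        super().__init__()
        # Layers kept in a plain Python list -- PyTorch does not register them
        self.convs = [
            nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            for _ in range(num_layers)
        ]

    def forward(self, x):
        # Chain the conv layers one after another
        for conv in self.convs:
            x = torch.relu(conv(x))
        return x

model = DynamicCNN(num_layers=3)
print(len(list(model.parameters())))  # prints 0 -- the conv parameters are invisible
```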

What is the best way to handle this kind of use case?

Can’t you use nn.ModuleList() in this case?
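
For example, adapting the sketch above (same made-up names and shapes):

```python
import torch
import torch.nn as nn

class DynamicCNN(nn.Module):
    def __init__(self, num_layers, channels=16):
        super().__init__()
        # nn.ModuleList registers each layer as a proper submodule, so their
        # parameters are picked up by model.parameters() and by optimizers
        self.convs = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            for _ in range(num_layers)
        )

    def forward(self, x):
        for conv in self.convs:
            x = torch.relu(conv(x))
        return x

model = DynamicCNN(num_layers=3)
print(len(list(model.parameters())))  # prints 6 -- weight + bias for each of the 3 convs
```

If the layers are always applied in the same fixed order, nn.Sequential would also work and lets forward be a single call.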