How to append model.parameters to optimizer

I don’t know how to append model.parameters() to an optimizer when some condition holds.
I just want to conditionally add model3.parameters().
This is what I have written so far, but it isn’t elegant.
How can I make it cleaner?

    import torch.optim as optim

    if condition:
        optimizer = optim.Adam(
            list(model1.parameters()) +
            list(model2.parameters()),
            lr=0.001, betas=(0.9, 0.999))
    else:
        optimizer = optim.Adam(
            list(model1.parameters()) +
            list(model2.parameters()) +
            list(model3.parameters()),
            lr=0.001, betas=(0.9, 0.999))

A better way to write it would be:

    learnable_params = list(model1.parameters()) + list(model2.parameters())
    if not condition:  # model3 joins only when the condition does not hold
        learnable_params += list(model3.parameters())

    optimizer = optim.Adam(learnable_params, lr=0.001, betas=(0.9, 0.999))

The idea is not to repeat the same code or parameters twice; duplicated blocks like that are bound to attract copy-paste errors.
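As a related option (not from the original answer): if the extra parameters only become relevant after the optimizer already exists, PyTorch’s Optimizer.add_param_group can append them to a live optimizer. A minimal sketch, assuming the same model1/model2/model3 modules and a boolean condition:

    import itertools

    import torch.optim as optim

    # Parameters that are always optimized.
    optimizer = optim.Adam(
        itertools.chain(model1.parameters(), model2.parameters()),
        lr=0.001, betas=(0.9, 0.999))

    # Append model3's parameters afterwards; the new group inherits the
    # optimizer defaults (lr, betas) unless overridden in the dict.
    if not condition:
        optimizer.add_param_group({'params': list(model3.parameters())})

Using itertools.chain also avoids building intermediate lists when the parameters are only passed through once.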


Thanks for the quick reply!
Is there a cleaner way to express this?

    for p in model1.parameters():
        p.requires_grad = True
    for p in model2.parameters():
        p.requires_grad = True
    for p in model3.parameters():
        p.requires_grad = True

Again, I would write a small helper, set_requires_grad(model, bool_val), and call it as set_requires_grad(model1, True):

    def set_requires_grad(model, bool_val):
        for p in model.parameters():
            p.requires_grad = bool_val

It’s just a strategy for reuse.
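A usage sketch for the helper (still assuming the hypothetical model1/model2/model3 and condition from above): freeze the conditional model, then hand the optimizer only the parameters that actually require gradients:

    import itertools

    import torch.optim as optim

    # model1 and model2 always train; model3 only when the condition fails.
    set_requires_grad(model1, True)
    set_requires_grad(model2, True)
    set_requires_grad(model3, not condition)

    # Build the optimizer from the trainable parameters only.
    all_params = itertools.chain(
        model1.parameters(), model2.parameters(), model3.parameters())
    optimizer = optim.Adam(
        [p for p in all_params if p.requires_grad],
        lr=0.001, betas=(0.9, 0.999))

Filtering on requires_grad keeps frozen parameters out of the optimizer’s param groups entirely.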
