[solved] Model parameters do not set requires_grad inside of a loop

This is a long-winded description of a simple problem.

I’m trying to create a very simple dictionary-based training scheduler for resnet18. Inside the loop I set `requires_grad = True`, but on the 2nd iteration the flags don’t take effect on the model’s parameters. It’s as if I’m holding references to a different set of layers, but I’m not sure how to proceed.

Here’s a simplified version of the code with debug prints. If you look at “[Debug section 2]” for the second iteration, you’ll see that the model’s parameter flags did not change.

schedule = [
    {'label': 'fc', 'tries': 1, 'epochs': 1,
     'params': [model.fc]},

    {'label': 'fc, layer4', 'tries': 1, 'epochs': 1,
     'params': [model.fc, model.layer4]}]

best_acc = 0.0

# Start with every parameter frozen; each schedule entry unfreezes its layers
for param in model.parameters():
    param.requires_grad = False

for s in schedule:
    print('\n----------------------------------------------------------------')
    print(f"Training layers {s['label']}")

    print('[Debug section 1]')
    # Unfreeze this stage's layers (these are references captured when
    # `schedule` was built)
    for layer in s['params']:
        for param in layer.parameters():
            print(param.requires_grad, end=' ')
            param.requires_grad = True
            print(param.requires_grad)

    print('[Debug section 2]')
    for param in model.parameters():
        print(param.requires_grad, end=' ')
    print()

    for t in range(1, s['tries'] + 1):
        print(f"Training layers {s['label']} try {t}/{s['tries']}")

        model, best_acc = train_model(model,
                                      criterion,
                                      params=s['params'],
                                      num_epochs=s['epochs'],
                                      best_acc=best_acc,
                                      init_lr=1e-4,
                                      lr_decay_epoch=75)

Edited output:
Training layers fc
[Debug section 1]
False True
False True
[Debug section 2]
False False False False False False False False False False False False
False False False False False False False False False False False False
False False False False False False False False False False False False
False False False False False False False False False False False False
False False False False False False False False False False False False
True True
Training layers fc try 1/1

----------------------------------------------------------------
Training layers fc, layer4
[Debug section 1]
True True
True True
False True
[13 More like this]
False True
[Debug section 2]
False False False False False False False False False False False False
False False False False False False False False False False False False
False False False False False False False False False False False False
False False False False False False False False False False False False
False False False False False False False False False False False False
True True 
Training layers fc, layer4 try 1/1

I figured out the solution. Working under the assumption that a reference was being invalidated (probably at `model = train_model(...)`, which rebinds `model` to a different object while `schedule` still holds submodules of the old one), I changed the schedule to store layer names instead of references to the layers, and to look each layer up on the current model inside the loop.
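Here’s a minimal sketch of that fix. It assumes `train_model` returns a different model object (e.g. a `copy.deepcopy` of the best weights, a common pattern); the `Net` class and the stub `train_model` below are hypothetical stand-ins, not the real resnet18 or training code. The key change is that the schedule stores layer *names*, and `getattr(model, name)` resolves them against the current `model` on every iteration:

```python
import copy
import torch.nn as nn

class Net(nn.Module):
    # Hypothetical stand-in for resnet18: it exposes `layer4` and `fc`
    # attributes just like the real model does.
    def __init__(self):
        super().__init__()
        self.layer4 = nn.Linear(8, 4)
        self.fc = nn.Linear(4, 2)

def train_model(model):
    # Stub for the real train_model. Many training loops return a
    # deepcopy of the best model, which rebinds `model` to a new object
    # at the `model = train_model(...)` line.
    return copy.deepcopy(model)

model = Net()
for p in model.parameters():
    p.requires_grad = False

# Store layer *names*, not module references. Stale references are
# impossible because names are resolved fresh each iteration.
schedule = [{'layers': ['fc']},
            {'layers': ['fc', 'layer4']}]

for s in schedule:
    for name in s['layers']:
        # getattr looks the layer up on the *current* model object
        for p in getattr(model, name).parameters():
            p.requires_grad = True
    model = train_model(model)

# After the second stage, fc and layer4 on the current model are unfrozen
print(all(p.requires_grad for p in model.fc.parameters()))      # True
print(all(p.requires_grad for p in model.layer4.parameters()))  # True
```

With the original reference-based schedule, the second iteration would set the flags on the *old* model’s `layer4` while `model` pointed at the copy, which is exactly the behavior shown in the debug output above.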