Freeze some layers of EfficientNet

Dear all,

I would like to fine-tune not just the last fully connected layer of EfficientNet-B0 but also some of the preceding blocks, freezing the earlier layers, in order to apply transfer learning to a fairly different domain.

This is something I have been able to achieve with ResNet18:

    # model is a pretrained ResNet18 and lt is the index of the first child
    # module that should stay trainable (both defined earlier in my script)
    cntr = 0
    for name, child in model.named_children():
        cntr += 1
        if cntr < lt:
            print(f'{name} not trainable')
            for param in child.parameters():
                param.requires_grad = False
        else:
            print(f'{name} trainable')

But I found this more complex to do with EfficientNet!
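
To make the question more concrete, here is a minimal sketch of what I imagine the equivalent would look like with torchvision's efficientnet_b0. The attribute names (model.features, model.classifier) and the threshold n_frozen_stages are my assumptions based on that implementation; other ports (e.g. EfficientNet-PyTorch, where the blocks live in _blocks and the head in _fc) are organised differently:

    # minimal sketch, assuming torchvision >= 0.13 and its efficientnet_b0
    import torchvision

    model = torchvision.models.efficientnet_b0(weights="IMAGENET1K_V1")

    n_frozen_stages = 5  # hypothetical threshold: freeze the first 5 of the 9 entries in model.features

    # model.features is an nn.Sequential (stem conv, 7 MBConv stages, head conv),
    # so the same counter-based loop used for ResNet18 can walk over it
    for idx, stage in enumerate(model.features):
        trainable = idx >= n_frozen_stages
        for param in stage.parameters():
            param.requires_grad = trainable
        print(f'features[{idx}] {"trainable" if trainable else "not trainable"}')

    # keep the classifier head (Dropout + Linear) trainable for the new domain
    for param in model.classifier.parameters():
        param.requires_grad = True

Is walking over model.features like this a reasonable way to do it, or is there a cleaner approach?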