Is there a way to make `models.[anynet](pretrained=True)` configurable, with `[anynet]` passed in as a parameter of a function?

i.e.

from torchvision import models
model = models.vgg19(pretrained=True)

vgg19 is the net I chose here. I have a list of nets I want to pass as the parameter to my function below; each call should take one name from the list and return a model like the example above.

Example list: [vgg19, resnet50, vit_b_16]

def fe_net(self, extractor):
    model = str(models + '.' + extractor + 'pretrained=True')
    modules = list(model.children())[:-1]  # delete the last fc layer.
    feature_extractor = nn.Sequential(*modules)
    for param in feature_extractor.parameters():
        param.requires_grad = False
    return feature_extractor

The current error message shows:

model = str(models + '.' + extractor + 'pretrained=True')
TypeError: unsupported operand type(s) for +: 'module' and 'str'

This blog post might be helpful, as it explains how the utility methods can be used to list and build all models.
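For example, a rough sketch of how the function could look (assuming a recent torchvision where models.get_model from that post is available; on older releases getattr(models, extractor) does the same job):

import torch.nn as nn
from torchvision import models

def fe_net(extractor):
    # extractor is a string such as "vgg19", "resnet50" or "vit_b_16"
    model = models.get_model(extractor, weights="DEFAULT")
    # fallback for older torchvision: model = getattr(models, extractor)(pretrained=True)
    # NOTE: children()[:-1] is fine for CNN backbones such as VGG/ResNet; a ViT
    # needs its own feature-extraction logic, so treat this as a starting point.
    modules = list(model.children())[:-1]  # drop the last classifier/fc block
    feature_extractor = nn.Sequential(*modules)
    for param in feature_extractor.parameters():
        param.requires_grad = False  # freeze the backbone
    return feature_extractor

for name in ["vgg19", "resnet50", "vit_b_16"]:
    fe = fe_net(name)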

@ptrblck Thanks so much! That definitely helped! One question: that post does not pass the pretrained=True parameter. Does weights="DEFAULT" work exactly the same?

Not exactly, as the DEFAULT weights might change between different torchvision releases, as described in Initializing pre-trained models.
Currently DEFAULT refers to IMAGENET1K_V2 for e.g. resnet50, while the deprecated pretrained=True argument still maps to IMAGENET1K_V1, as shown in the warning:

>>> torchvision.models.resnet18(pretrained=True)
UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.
UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=ResNet18_Weights.IMAGENET1K_V1`. You can also use `weights=ResNet18_Weights.DEFAULT` to get the most up-to-date weights.
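So if you need the run to stay reproducible across torchvision releases, it is safer to pin the weight enum explicitly instead of relying on DEFAULT; a minimal sketch:

from torchvision import models
from torchvision.models import ResNet50_Weights

v1 = models.resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)  # matches the old pretrained=True
best = models.resnet50(weights=ResNet50_Weights.DEFAULT)      # currently IMAGENET1K_V2, may change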