Instantiating VGG16 with state dictionary without downloading weights

I have a .pth file for VGG16, vgg16-397923af.pth, and I'd like to instantiate a new VGG16 model with exactly the same weights as in that file. vgg16 = models.vgg16() always tries to download the weights first, but I don't want any downloads, just a reference to the .pth file. Is this possible?

That’s not the case, as the default argument randomly initializes the parameters:

model = models.vgg16()
# no output
model = models.vgg16(pretrained=True)
# Downloading: "" to ...

and the downloaded checkpoint will be stored in ~/.cache/torch/hub/checkpoints by default.

Thanks ptrblck,
when I call model = models.vgg16() it downloads the weights, but model = models.vgg16(pretrained=False) randomly initializes the parameters.

pip list | grep torch
torch 2.0.1
torchaudio 2.0.2
torchvision 0.15.2

It’s not doing that for me, as mentioned in my previous post and also in the docs:

weights (VGG16_Weights, optional) – The pretrained weights to use. See VGG16_Weights below for more details, and possible values. By default, no pre-trained weights are used.

so you would need to come up with a way to reproduce the issue.

Well, you are correct. It looks like I was absent-minded when posting this. It's great to have you here, ptrblck, rapidly responding to everything since the beginning.

Good to hear, it's indeed not the case, as I was struggling to find any way this would be possible. :slight_smile: