Access all weights of a model

Thanks for the reply. The models are pre-trained. I have the same question for a pretrained, quantized model from PyTorch.
Code:
model = models.quantization.resnet18(pretrained=True, quantize=True)
for param_tensor in model.state_dict():
    double_x = model.state_dict()[param_tensor].numpy()

I am getting the following error:
double_x = model.state_dict()[param_tensor].numpy()
TypeError: Got unsupported ScalarType QInt8

Can you please share how to convert this to a NumPy array?
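For reference, the error occurs because NumPy has no quantized integer dtype. One common workaround (a sketch, assuming a recent PyTorch version) is to either dequantize the tensor to float first, or take its raw integer representation. A small standalone example with a quantized tensor, so no model download is needed:

```python
import torch

# Build a small quantized tensor to stand in for a state_dict entry.
x = torch.randn(2, 3)
qx = torch.quantize_per_tensor(x, scale=0.1, zero_point=0, dtype=torch.qint8)

# Option 1: dequantize to float32 first, then convert to NumPy.
float_array = qx.dequantize().numpy()

# Option 2: take the raw int8 representation instead.
int_array = qx.int_repr().numpy()

print(float_array.dtype, int_array.dtype)
```

Either call should work on any quantized tensor in the model's state_dict; which one you want depends on whether you need the float values or the underlying integers.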

Hi, I want to save all the intermediate weights in a variable (not save as a .pth file). How to achieve that?

Do I just use the code below?
weights = model.parameters()

model.parameters() will return all trainable parameters, which includes the bias parameters, if available.
I’m not sure if you want to explicitly filter out the weights, but if so you could use model.named_parameters() and filter for "weight" in each parameter’s name.
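A minimal sketch of that filtering, using a throwaway two-layer model as a stand-in for the real one:

```python
import torch.nn as nn

# Hypothetical small model, just to illustrate the filtering.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Keep only the parameters whose name contains "weight", dropping biases.
weights = {name: p for name, p in model.named_parameters() if "weight" in name}

print(list(weights.keys()))
```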

Thank you for replying. I want to extract the best configuration of weights, i.e. the one for which my model gives the lowest loss on the validation set, and I do not want to call torch.save(model.state_dict(), path) after every epoch. Rather, I want to save it into a variable x, and that x will be updated whenever my current validation loss (at epoch e) is lower than the previous best validation loss (at epoch e-1).
Is there a way to do that, i.e. save the entire weight config in a single variable?

I’m unsure why saving the state_dict into a single file (and overwriting the previous state_dict) in an epoch with a new lowest validation loss wouldn’t work.
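That said, if an in-memory copy really is preferred over a file, one way (a sketch, not the only option) is to deepcopy the state_dict whenever the validation loss improves. The deepcopy matters because state_dict() returns references to the live tensors, which later training steps would mutate:

```python
import copy
import torch.nn as nn

model = nn.Linear(4, 2)  # stand-in for the real model
best_loss = float("inf")
best_state = None

for epoch in range(3):
    # ... training step would go here ...
    val_loss = 1.0 / (epoch + 1)  # dummy validation loss for illustration
    if val_loss < best_loss:
        best_loss = val_loss
        # deepcopy is required: without it, best_state would keep
        # references to tensors that keep changing during training.
        best_state = copy.deepcopy(model.state_dict())

# Restore the best weights at the end of training.
model.load_state_dict(best_state)
```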


Alright, got it. Thank you!