Access all weights of a model

Thanks for the reply. The models are pre-trained. I have the same question regarding a pretrained and quantized model from PyTorch.
Code:
from torchvision import models

model = models.quantization.resnet18(pretrained=True, quantize=True)
for param_tensor in model.state_dict():
    double_x = model.state_dict()[param_tensor].numpy()

I am getting the following error:
double_x = model.state_dict()[param_tensor].numpy()
TypeError: Got unsupported ScalarType QInt8

Could you please share how to convert these to a NumPy array?
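For anyone hitting the same error: quantized tensors can't be passed to .numpy() directly. A minimal sketch of the usual workaround (assuming you either want the dequantized float values via .dequantize() or the raw integer representation via .int_repr()):

for param_tensor in model.state_dict():
    t = model.state_dict()[param_tensor]
    if t.is_quantized:
        # Convert to float first, then to NumPy ...
        double_x = t.dequantize().numpy()
        # ... or grab the underlying integer representation instead:
        int_x = t.int_repr().numpy()
    else:
        double_x = t.numpy()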

Hi, I want to save all the intermediate weights in a variable (not save them as a .pth file). How can I achieve that?

Do I just use the code below?
weights = model.parameters()

model.parameters() will return all trainable parameters, so also the bias parameters, if available.
I’m not sure if you want to explicitly filter for the weights, but if so you could use model.named_parameters() and filter for "weight" in each parameter’s name.
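For example, a minimal sketch of that filtering (just one way to do it):

# Keep only the weight tensors, skipping biases and other parameters
weights = {name: param for name, param in model.named_parameters() if "weight" in name}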

Thank you for replying. I want to extract the best-performing set of weights, i.e. the one for which my model gives the lowest loss on the validation set, and I do not want to call torch.save(model.state_dict(), path)
after every epoch. Instead, I want to save the weights into a variable x, and update x whenever my current validation loss (at epoch e) is lower than the previous validation loss (at epoch e-1).
Is there a way to do that, i.e. to save the entire weight configuration in a single variable?
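For anyone looking for the same pattern, a minimal sketch (train_one_epoch, evaluate, num_epochs, and best_state are hypothetical names; copy.deepcopy keeps the snapshot from changing as training continues):

import copy

best_val_loss = float("inf")
best_state = None

for epoch in range(num_epochs):
    train_one_epoch(model)      # hypothetical training step
    val_loss = evaluate(model)  # hypothetical validation step
    if val_loss < best_val_loss:
        best_val_loss = val_loss
        # Deep copy, so later parameter updates don't mutate the snapshot
        best_state = copy.deepcopy(model.state_dict())

# Later: restore the best weights, entirely in memory
model.load_state_dict(best_state)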

I’m unsure why saving the state_dict into a single file (and overwriting the previous state_dict) in an epoch with a new lowest validation loss wouldn’t work.


Alright, got it, thank you.

How could I separate the weights of the last fully connected layer after training for a few local epochs?

I also need to access the weights for every epoch separately, so if you find anything, please let me know. Thanks.

I had the same question today. The call model.parameters() returns a generator, which means it can be consumed in the same way as any generator, e.g. with the unpacking operator *.
In this case, to get a list of the parameters we can do:

param_list = [*model.parameters()]

Note that this gives a list of tensors, each of which holds the parameters of one part of the model. They typically alternate between weights and biases (layers without a bias break this pattern); for example, the last layer is arranged as follows:

final_layer_bias_tensor = param_list[-1]
final_layer_weight_tensor = param_list[-2]

Obviously, if you have a big model, be aware of memory limitations.
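Also note that indexing by position breaks for layers without a bias (e.g. the conv layers in ResNet, which use bias=False), so looking parameters up by name is a bit more robust, and it also answers the earlier question about the last fully connected layer. A minimal sketch, assuming a torchvision ResNet where the final layer is named fc:

# Look parameters up by name instead of by position
params = dict(model.named_parameters())
final_layer_weight_tensor = params["fc.weight"]
final_layer_bias_tensor = params["fc.bias"]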