Accessing parameters and weights during training

Hi All,

In reference to the topic title, is there a way to access the parameters and weights during training?
The proposed solutions I’ve seen so far on the forum are the following:
Calling the .parameters() method on the model seems to return only the parameters as they were set at initialization, e.g.,
for p in model.parameters():
    print(p.data)
In addition, inspecting an individual layer via model.Layer.weight.data behaves the same way.
Neither approach shows the values changing as the model trains.
I would like to see the weights changing as a function of epoch/minibatch.
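For reference, here is a minimal sketch of what I'm doing (the linear model, data, and loop are just stand-ins for my actual setup); every entry in history ends up looking identical:

import torch
import torch.nn as nn

# toy stand-ins for my real model and data
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randn(8, 2)

history = []
for epoch in range(3):
    optimizer.zero_grad()
    nn.functional.mse_loss(model(x), y).backward()
    optimizer.step()
    # appending the live tensors: every entry points at the same
    # storage, so the whole list shows only the final weights
    history.append([p.data for p in model.parameters()])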

Thanks!!!

Did you take a look at using state_dict? More on that here.
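Something along these lines, as a rough sketch (the model and training loop are placeholders); copy.deepcopy keeps each snapshot independent of the live weights:

import copy
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randn(8, 2)

snapshots = []
for epoch in range(3):
    optimizer.zero_grad()
    nn.functional.mse_loss(model(x), y).backward()
    optimizer.step()
    # deepcopy detaches the snapshot from the live parameters,
    # so each entry preserves the weights as of this epoch
    snapshots.append(copy.deepcopy(model.state_dict()))

# the first and last snapshots now differ
print(torch.equal(snapshots[0]['weight'], snapshots[-1]['weight']))  # False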


Yeah, that works. One catch: I have to copy the tensors in the state_dict from the GPU to the CPU before appending them to a list; otherwise every entry is just a reference to the same live parameter tensor and the whole list ends up showing identical values. Thanks!
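For anyone who hits this later, roughly what I ended up with (again with a stand-in model and data; .detach().cpu().clone() is one way to force an independent copy):

import torch
import torch.nn as nn

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = nn.Linear(4, 2).to(device)  # stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4, device=device), torch.randn(8, 2, device=device)

weight_log = []
for step in range(3):
    optimizer.zero_grad()
    nn.functional.mse_loss(model(x), y).backward()
    optimizer.step()
    # .cpu() copies a CUDA tensor to host memory, so the logged values
    # no longer alias the parameters still being trained; .clone()
    # forces a copy in the CPU-only case too
    weight_log.append({k: v.detach().cpu().clone()
                       for k, v in model.state_dict().items()})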
