Access variables inside Modules

Good Morning everyone.

I have an issue that seems like it should be obvious, but I haven't found an elegant solution for it yet.

I have a class My_model(nn.Module): {.....} which returns a feature tensor. All the rest of my code is built around this, so I cannot change what is returned.

Inside the model I create a variable (essentially some attention weights) that I would like to access from outside the model, in order to store it in a log file. What's the most "pytorchy" way to do that?

Do you assign the attention weights to self? If so, you can just read model.attention_weights outside the model and get the tensor.
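A minimal sketch of that pattern, assuming a toy single-head attention model (the names `MyModel`, `q`, `k`, and the shapes are illustrative, not the poster's actual code). The forward pass stashes the softmaxed weights on `self` (detached, so logging them doesn't keep the autograd graph alive) while the return value stays a feature tensor:

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    """Toy stand-in for the poster's My_model."""

    def __init__(self, dim=8):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.attention_weights = None  # refreshed on every forward pass

    def forward(self, x):
        # x: (batch, seq, dim)
        scores = self.q(x) @ self.k(x).transpose(-1, -2) / x.size(-1) ** 0.5
        weights = torch.softmax(scores, dim=-1)  # (batch, seq, seq)
        # Stash a detached copy for external logging; detach() avoids
        # holding the computation graph just for the log file.
        self.attention_weights = weights.detach()
        return weights @ x  # return type unchanged: a feature tensor

model = MyModel()
features = model(torch.randn(2, 4, 8))
print(model.attention_weights.shape)  # accessible from outside the model
```

Note that `attention_weights` here is a plain tensor attribute, not an `nn.Parameter`, so it is not trained or saved in `state_dict()`; it simply holds the value from the most recent forward pass.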