So with state_dict, to add something to it, you pick a parameter, get its state with state = self.state[parameter], and then put something in that parameter's state. But I have a variable that would be more convenient to store globally, not in every parameter's state. For now I just do self.variable = ..., but that wouldn't get saved and loaded with state_dict. So I was wondering, is there any way to save such a variable to the optimizer's state_dict?
I’m unsure if I understand your use case correctly, so please let me know if I misunderstand the “global” meaning.
self.variable = nn.Parameter(...) will register the nn.Parameter inside the nn.Module and will add it to the state_dict. If you want to register a non-trainable tensor inside this module and also want to add it to the state_dict, you could use self.register_buffer (with persistent=True, which is the default).
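For example (a minimal sketch; MyModule and running_stat are just placeholder names):

```python
import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # trainable parameter, included in state_dict
        self.weight = nn.Parameter(torch.randn(4))
        # non-trainable tensor, also included in state_dict
        # (persistent=True is the default)
        self.register_buffer("running_stat", torch.zeros(4))

module = MyModule()
print(module.state_dict().keys())
# odict_keys(['weight', 'running_stat'])
```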
The optimizer receives references to the model's parameters during its initialization, so I'm unsure why you would want to add new parameters manually.
Sorry, I wasn't too clear on this. I am creating an optimizer, and optimizers also have their own state_dict, where they store things like momentum. That state is kept in a dictionary keyed by parameter, accessed via state = self.state[parameter], so you can easily store momentum for each parameter. But I have something that would be more convenient to keep global rather than linked to a particular parameter, and I was wondering if I could do that and still save the variable to the optimizer's state_dict.
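Something like this, to be concrete (a minimal sketch; MyOptimizer and the plain momentum-SGD update are just for illustration). Note that Optimizer.state_dict() only serializes self.state and self.param_groups, so a plain attribute like self.variable is indeed not saved:

```python
import torch
from torch.optim import Optimizer

class MyOptimizer(Optimizer):
    def __init__(self, params, lr=0.01, momentum=0.9):
        super().__init__(params, defaults=dict(lr=lr, momentum=momentum))
        # plain attribute: NOT included in state_dict(), this is the problem
        self.variable = 0.0

    @torch.no_grad()
    def step(self, closure=None):
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                # per-parameter dict; this IS serialized by state_dict()
                state = self.state[p]
                if "momentum_buffer" not in state:
                    state["momentum_buffer"] = torch.zeros_like(p)
                buf = state["momentum_buffer"]
                buf.mul_(group["momentum"]).add_(p.grad)
                p.add_(buf, alpha=-group["lr"])
```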
Would it be possible to add your custom parameter to the parameter group instead of the state?
The optimizer's state_dict contains both the per-parameter state and the param_groups, so anything stored in a param group would be saved and restored as well.
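Something like this should work (a small sketch using SGD; the key my_global_value is just a placeholder name):

```python
import torch

model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# store a "global" value in the parameter group;
# param_groups are serialized by state_dict()
optimizer.param_groups[0]["my_global_value"] = 42

sd = optimizer.state_dict()
print(sd["param_groups"][0]["my_global_value"])  # 42

# the value round-trips through save/load
new_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
new_optimizer.load_state_dict(sd)
print(new_optimizer.param_groups[0]["my_global_value"])  # 42
```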