How to switch a variable between state_dict and parameter_dict

Hello,

I understand the difference between register_buffer() and register_parameter()/nn.Parameter.

In my models I initialize my variable either as a buffer or as a parameter, similar to this code:

if self.is_param:
    self.variable = torch.nn.Parameter(variable)
else:
    self.register_buffer("variable", variable)

# This is the function I want to implement (but I have no idea how)
def change_variable_state(self, flag_param: bool = True):
    if flag_param:
        ...  # make the variable a parameter
    else:
        ...  # make it a normal buffer that still appears in the state_dict

However, I would now like to be able to move this variable from a Parameter to a "normal" buffered variable and vice-versa, i.e. to transform a buffered variable into a parameter and back.

In other words: I want to dynamically change the "buffer" status of my variable, so that it either shows up in named_parameters() or it doesn't.

Any help is much appreciated.

Thanks

Hi,

The simplest way I can think of to do that is to remove it and add it again:

# from param to buffer
var = self.variable
del self.variable
self.register_buffer("variable", var)
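If it helps, here is a rough sketch of how the change_variable_state() method from the question could be written with that pattern (assuming the tensor lives under the attribute name variable and should be trainable when it is a parameter):

def change_variable_state(self, flag_param: bool = True):
    var = self.variable   # works whether it is currently a buffer or a parameter
    del self.variable     # nn.Module.__delattr__ removes it from _buffers or _parameters
    if flag_param:
        # re-register as a trainable parameter (shows up in named_parameters())
        self.variable = torch.nn.Parameter(var.detach())
    else:
        # re-register as a plain buffer (still saved in the state_dict)
        self.register_buffer("variable", var.detach())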


Clean and simple. Thanks

In case you want to do something more elaborate, you can directly access the underlying dicts, but @albanD's solution is the best:

import torch

# a bare module to demonstrate the dict manipulation
model = torch.nn.Module()
model.register_buffer("_variable", torch.FloatTensor([0]))
model.register_parameter("_parameter", torch.nn.Parameter(torch.FloatTensor([1])))

# move the tensor from the parameter dict to the buffer dict
model._buffers["_parameter"] = model._parameters["_parameter"].cpu().detach()
model._parameters.pop("_parameter")
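
A quick sanity check (using the model from the snippet above) to confirm that the tensor has moved from the parameter dict to the buffer dict but is still part of the state_dict:

print([name for name, _ in model.named_parameters()])  # '_parameter' is gone
print([name for name, _ in model.named_buffers()])     # ['_variable', '_parameter']
print(list(model.state_dict().keys()))                 # both tensors are still saved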