Given a trained model, is there a way to turn a parameter into a buffer to keep it fixed, as an alternative to freezing that parameter?
It seems this can be done as follows:

```python
for name, param in list(module.named_parameters(recurse=False)):
    delattr(module, name)  # Unregister the parameter
    module.register_buffer(name, param.detach())  # Re-register it as a buffer
```

The `list(...)` matters: `delattr` mutates the module's parameter dict, which would raise a `RuntimeError` if the `named_parameters` generator were still being consumed. `detach()` turns the `Parameter` into a plain tensor, so the new buffer no longer requires grad.
`recurse` must be set to `False`: with the default `recurse=True`, `named_parameters` yields dotted names such as `"0.weight"` for parameters owned by submodules, and `delattr(module, name)` fails for those. Instead, the conversion has to be repeated for every submodule.
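A quick sketch of why the recursive case fails (the `nn.Sequential` model here is just an illustration):

```python
import torch.nn as nn

model = nn.Sequential(nn.Linear(2, 2))

# With recurse=True (the default), names are dotted paths into submodules,
# e.g. "0.weight" -- delattr on the top-level module cannot resolve them.
name, _ = next(model.named_parameters())
print(name)  # 0.weight

try:
    delattr(model, name)
except AttributeError:
    print("delattr failed: the parameter lives on a submodule")
```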
Edit:
Here is a function that does this:
```python
def param_to_buffer(module):
    """Turns all parameters of a module (and its submodules) into buffers."""
    for name, param in list(module.named_parameters(recurse=False)):
        delattr(module, name)  # Unregister the parameter
        module.register_buffer(name, param.detach())  # Re-register it as a buffer
    for child in module.children():
        param_to_buffer(child)  # Repeat for every submodule
```
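A quick check of the function on a small throwaway model (the two-layer `nn.Sequential` below is just an example):

```python
import torch
import torch.nn as nn

def param_to_buffer(module):
    """Turns all parameters of a module (and its submodules) into buffers."""
    for name, param in list(module.named_parameters(recurse=False)):
        delattr(module, name)  # Unregister the parameter
        module.register_buffer(name, param.detach())  # Re-register it as a buffer
    for child in module.children():
        param_to_buffer(child)  # Repeat for every submodule

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
param_to_buffer(model)

print(len(list(model.parameters())))  # 0 -- nothing left for an optimizer to update
print(len(list(model.buffers())))     # 4 -- two weights and two biases, now buffers
print(sorted(model.state_dict()))     # ['0.bias', '0.weight', '2.bias', '2.weight']

# The forward pass still works, and no gradients are tracked through it.
out = model(torch.randn(3, 4))
print(out.requires_grad)              # False
```

Since `model.parameters()` is now empty, an optimizer built from it would have nothing to update, which is exactly the point. The former parameters still travel with `state_dict()` and move with `.to(device)`, as buffers do.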