The typical way to assign a buffer to a module is of course

```python
foo = torch.rand(3, 4)
self.register_buffer('buffer', foo)
```
and the buffer can then be accessed as `self.buffer` elsewhere in the module class.
My question is: is it possible to first perform an explicit assignment (`self.buffer = ...`) and then somehow "mark" that attribute as a buffer? While this does not affect actual runtime behavior, I rely heavily on my IDE's type hinting / auto-completion, and the IDE cannot detect attributes unless they are given values by an assignment statement somewhere (preferably in `__init__`) in the first place.
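To make the issue concrete, here is a minimal sketch (the class name `Net` and buffer name `scale` are just placeholders):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # The attribute `scale` only comes into existence inside
        # register_buffer, so there is no assignment statement for
        # the IDE's static analysis to pick up.
        self.register_buffer('scale', torch.rand(3, 4))

    def forward(self, x):
        # Works fine at runtime, but the IDE cannot infer the type
        # (or even the existence) of self.scale here.
        return x * self.scale
```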
It seems that parameters and submodules can be registered by plain assignment, instead of using the dedicated methods `register_parameter` / `add_module`. I wonder whether the same can be done for buffers.
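For contrast, a minimal sketch of the assignment-based registration I mean (names are placeholders); note that a plain tensor assigned the same way is just an ordinary attribute, not a buffer:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.Module.__setattr__ intercepts these assignments and
        # registers them automatically:
        self.weight = nn.Parameter(torch.rand(3, 4))  # becomes a parameter
        self.proj = nn.Linear(4, 2)                   # becomes a submodule
        # ...but a plain tensor is not registered and will not appear
        # in named_buffers() or the state_dict:
        self.scale = torch.rand(4)
```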