I have a model with a constant Variable that is used in the forward pass. I want to use DataParallel, so I've registered the constant as a buffer (`self.register_buffer(buffer_name, constant_variable)`) so that it will be replicated with the model.
I have a second model (which inherits from the first) that needs to override/modify the constant. When I try `self.buffer_name = new_variable`, I get an error saying that a buffer must be set with a `torch.Tensor`:

`TypeError: cannot assign 'torch.autograd.variable.Variable' as buffer 'flow_mean' (torch.Tensor or None expected)`
So I was able to register the buffer as a Variable initially, but I'm not able to set it via `__setattr__` with a Variable? This doesn't seem right to me.
I can `del self.buffer_name` and then re-register it, but that seems like a hack.
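For reference, here is a minimal sketch of the pattern I mean (class names are illustrative; the buffer name `flow_mean` is from the error above). Re-calling `register_buffer` with the same name overwrites the existing buffer, which avoids both the `__setattr__` type check and the del-and-reset dance:

```python
import torch
import torch.nn as nn


class BaseModel(nn.Module):
    def __init__(self):
        super().__init__()
        # constant used in the forward pass, registered as a buffer
        # so DataParallel replicates it alongside the parameters
        self.register_buffer('flow_mean', torch.zeros(3))

    def forward(self, x):
        return x + self.flow_mean


class ChildModel(BaseModel):
    def __init__(self):
        super().__init__()
        # plain assignment (self.flow_mean = ...) trips the buffer
        # type check; re-registering overwrites the buffer cleanly
        self.register_buffer('flow_mean', torch.ones(3))


m = ChildModel()
print(m.flow_mean)           # the overridden constant
print(m(torch.zeros(3)))     # forward pass uses the new value
```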
Is this behavior intentional, or should `nn.Module` support setting its buffers with Variables?