Setting nn.Module buffers with autograd.Variables

I have a model with a constant that is used in the forward pass in some autograd.Variable math.
I want to use DataParallel, so I’ve registered the constant Variable as a buffer (self.register_buffer(buffer_name, constant_variable)) so that it will be replicated with the model.
I have a second model (which inherits from the first) that wants to override/modify the constant Variable buffer_name.
When I try self.buffer_name = new_variable, I get an error saying that buffers must be set with a Tensor:

TypeError: cannot assign 'torch.autograd.variable.Variable' as buffer 'flow_mean' (torch.Tensor or None expected)
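Roughly what I'm doing (buffer name taken from the error above; the shape is made up):

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

class Base(nn.Module):
    def __init__(self):
        super(Base, self).__init__()
        # registering the constant as a Variable is accepted here
        self.register_buffer('flow_mean', Variable(torch.zeros(10)))

    def forward(self, x):
        # the buffer participates in Variable math in the forward pass
        return x - self.flow_mean

class Child(Base):
    def __init__(self):
        super(Child, self).__init__()
        # this assignment raises the TypeError above
        self.flow_mean = Variable(torch.ones(10))
```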

So I was able to register the buffer as a Variable initially, but I'm not able to assign a Variable to it via __setattr__ afterwards? This doesn't seem right to me.
I can try to del self.buffer_name and reset, but that seems like a hack.
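The hack I have in mind, assuming I reach into the private _buffers dict:

```python
# delete the old entry and re-register; relies on Module internals
del self._buffers['flow_mean']
self.register_buffer('flow_mean', new_variable)
```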
Is this behavior intentional, or should nn.Module support setting its buffers with Variables?

Buffers have to be Tensors, not Variables. It seems like a bug that register_buffer let you register a Variable in the first place.
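Something like this should work instead (a sketch, reusing your flow_mean name and an arbitrary shape): register the buffer as a plain Tensor so DataParallel still replicates it, wrap it in a Variable at use time, and assign a Tensor when the subclass overrides it.

```python
import torch
import torch.nn as nn
from torch.autograd import Variable

class Base(nn.Module):
    def __init__(self):
        super(Base, self).__init__()
        # keep the buffer a plain Tensor; DataParallel will replicate it
        self.register_buffer('flow_mean', torch.zeros(10))

    def forward(self, x):
        # wrap at use time so the constant can join Variable math
        return x - Variable(self.flow_mean)

class Child(Base):
    def __init__(self):
        super(Child, self).__init__()
        # assigning a Tensor (not a Variable) to an existing buffer is allowed
        self.flow_mean = torch.ones(10)
```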
