What is the correct way to assign a new value to a buffer?

Hello,

I want to save some intermediate feature values during training. However, I have failed to assign new values to existing buffers using getattr() and setattr(); an example follows:

import torch
import torch.nn as nn

class MyNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 3, 3, 1, 1)
        self.conv2 = nn.Conv2d(3, 3, 3, 1, 1)
        self.register_buffer('inter_feature', torch.Tensor())
        self.register_buffer('count', torch.tensor(0))
        self.register_buffer('weight', torch.ones(10))

    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)

        # case 1: does not work for updating self.count
        count = getattr(self, 'count')
        count = count + 1
        setattr(self, 'count', count)

        # case 2: works
        count = getattr(self, 'count')
        count += 1
        setattr(self, 'count', count)

        # case 3: does not work
        feature = getattr(self, 'inter_feature')
        feature = x
        setattr(self, 'inter_feature', feature)

        # case 4: works
        feature = getattr(self, 'inter_feature')
        feature.data = x.data

        return x

Could you tell me how I can correctly use getattr and setattr to update buffers during training?

Thanks in advance!

I think using self.xxx = aaa may be a good way to change a buffer.
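For example, a minimal sketch (the Counter module is made up for illustration; it assumes a recent PyTorch where assigning a Tensor to an already-registered buffer name routes through nn.Module.__setattr__ and replaces the stored buffer):

import torch
import torch.nn as nn

class Counter(nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer('count', torch.tensor(0))

    def forward(self, x):
        # Plain attribute assignment: because 'count' is already a
        # registered buffer, the new tensor is stored in the module's
        # buffer dict rather than as an ordinary attribute.
        self.count = self.count + 1
        return x

m = Counter()
m(torch.zeros(1))
print(m.count)                   # tensor(1)
print(dict(m.named_buffers()))   # {'count': tensor(1)}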

Thanks for reply!
Absolutely, using self.xxx = aaa is a convenient way to update buffers. But in my code, I need to register buffers like this:

for i in range(stages):
    self.register_buffer('feature_stage{}'.format(i), torch.Tensor())

In this case, I need to access the buffers with getattr(self, 'feature_stage{}'.format(i)), and I cannot write self.feature_stage{}.format(i) = aaa to update a buffer, since the name is built dynamically.
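For context, a rough sketch of what I am doing now (StagedNetwork, save_stage, and load_stage are made-up names; whether setattr replaces a registered buffer like this may depend on the PyTorch version):

import torch
import torch.nn as nn

class StagedNetwork(nn.Module):
    def __init__(self, stages):
        super().__init__()
        for i in range(stages):
            self.register_buffer('feature_stage{}'.format(i), torch.Tensor())

    def save_stage(self, i, value):
        # setattr on a name that is already registered as a buffer goes
        # through nn.Module.__setattr__ and replaces the stored tensor.
        setattr(self, 'feature_stage{}'.format(i), value.detach())

    def load_stage(self, i):
        return getattr(self, 'feature_stage{}'.format(i))

m = StagedNetwork(stages=3)
m.save_stage(1, torch.randn(2, 3))
print(m.load_stage(1).shape)  # torch.Size([2, 3])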

Is there a more convenient way, like self.xxx = aaa, for my case?

Why not use a matrix, e.g. an N×M tensor, where N is the number of stages?
Then you could update it with self.xxx[i] = aaa.
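A minimal sketch of that idea (the names and shapes here are just assumptions; this only works when every stage's feature has the same, known shape):

import torch
import torch.nn as nn

class StagedNetwork(nn.Module):
    def __init__(self, stages, feat_dim):
        super().__init__()
        # One buffer holding all stages: row i stores stage i's feature.
        self.register_buffer('features', torch.zeros(stages, feat_dim))

    def save_stage(self, i, value):
        # In-place row assignment keeps the single registered buffer,
        # so no setattr or dynamic attribute names are needed.
        self.features[i] = value.detach()

m = StagedNetwork(stages=4, feat_dim=10)
m.save_stage(2, torch.randn(10))
print(m.features[2])

A side benefit of one stacked buffer is that it moves with .to(device) and appears as a single entry in the state_dict, instead of one entry per stage.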


Thanks a lot, I will give it a try 😉