Store the latest input data into a layer

In my code, I need to store the latest input data in a layer (here `_AdaptiveAvgPoolNd`), perform the average pooling operation on the difference between the new input and the stored latest input, and then update the stored value. I was wondering what the best way to define latest_input is? With the Variable, I get the following error during training:

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable


class _AdaptiveAvgPoolNd(nn.Module):
    __constants__ = ['output_size']

    def __init__(self, batch,channel,width,height,output_size):
        super(_AdaptiveAvgPoolNd, self).__init__()
        self.output_size = output_size
        self.latest_input = Variable(torch.zeros(batch,channel,width,height), requires_grad=True)

    def extra_repr(self):
        return 'output_size={}'.format(self.output_size)


class AdaptiveAvgPool2d_modified(_AdaptiveAvgPoolNd):
   
    def forward(self, input):

        output = F.adaptive_avg_pool2d((input - self.latest_input), self.output_size)
        self.latest_input = input
            
        return output
Variable._execution_engine.run_backward(
RuntimeError: Trying to backward through the graph a second time, but the saved intermediate results have already been freed. Specify retain_graph=True when calling backward the first time.

I guess your issue is that your latest_input interferes with the current graph, as the error suggests. Just use self.latest_input.clone().detach() so you're only providing the data needed.
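
For what it's worth, here is a minimal sketch of how that suggestion could look (the class name AdaptiveAvgPool2d_detached and the choice to apply the detach when the stored tensor is updated are my assumptions; the key point is that latest_input must not keep graph history):

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveAvgPool2d_detached(nn.Module):
    def __init__(self, batch, channel, width, height, output_size):
        super(AdaptiveAvgPool2d_detached, self).__init__()
        self.output_size = output_size
        # Plain tensor, no requires_grad: it only stores data between forward calls
        self.latest_input = torch.zeros(batch, channel, width, height)

    def forward(self, input):
        # Pool the difference between the current input and the stored one
        output = F.adaptive_avg_pool2d(input - self.latest_input, self.output_size)
        # Clone and detach before storing so the next backward pass does not
        # try to traverse this iteration's graph again
        self.latest_input = input.clone().detach()
        return output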
