Max Unpooling Indices

I have a convolutional autoencoder with 2 max pooling and 2 max unpooling layers.
The network is created dynamically, so it can also have 3 or 4 pooling layers.

(0): Conv2d(1, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(1): ReLU()
(2): MaxPool2d(kernel_size=(2, 2), stride=(2, 2), padding=0, dilation=1, ceil_mode=False)
(3): Conv2d(32, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(4): ReLU()
(5): MaxPool2d(kernel_size=(2, 2), stride=(2, 2), padding=0, dilation=1, ceil_mode=False)
(6): Flatten()
(7): Linear(in_features=3136, out_features=32, bias=True)
(8): ReLU()
(9): Linear(in_features=32, out_features=3136, bias=True)
(10): ReLU()
(11): ReshapeLayer()
(12): MaxUnpool2d(kernel_size=(2, 2), stride=(2, 2), padding=(0, 0))
(13): ConvTranspose2d(64, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(14): ReLU()
(15): MaxUnpool2d(kernel_size=(2, 2), stride=(2, 2), padding=(0, 0))
(16): ConvTranspose2d(32, 1, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
(17): ReLU()
)

The problem is that I need to store the indices during max pooling so that the unpooling layers can reuse them, but since the network is created dynamically I am not sure how to do this.

This is what my forward function looks like:
def forward(self, x):
    for layer in self.module_list:
        x = layer(x)
    return x

The easiest approach without changing your code too much is probably to add conditions based on the type of the current layer (note that the MaxPool2d layers then need to be created with return_indices=True so that they return the pooling indices along with their output):

def forward(self, x):
    pool_idx = []  # stack of indices returned by the pooling layers
    for layer in self.module_list:
        if isinstance(layer, nn.MaxPool2d):
            # requires the layer to be created with return_indices=True
            x, idx = layer(x)
            pool_idx.append(idx)
        elif isinstance(layer, nn.MaxUnpool2d):
            # reuse the indices in reverse order (last pooled, first unpooled)
            idx = pool_idx.pop(-1)
            x = layer(x, idx)
        else:
            x = layer(x)
    return x
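Below is a minimal, self-contained sketch of how this could look end to end, assuming the pooling layers are created with return_indices=True. The Flatten/Linear/ReshapeLayer part of your model is left out to keep it short, and ToyUnpoolAE is just an illustrative name, not your actual class:

import torch
import torch.nn as nn

class ToyUnpoolAE(nn.Module):
    def __init__(self):
        super().__init__()
        # reduced version of the architecture above, built as a ModuleList
        self.module_list = nn.ModuleList([
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2, stride=2, return_indices=True),  # returns (output, indices)
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2, stride=2, return_indices=True),
            nn.MaxUnpool2d(2, stride=2),
            nn.ConvTranspose2d(64, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxUnpool2d(2, stride=2),
            nn.ConvTranspose2d(32, 1, kernel_size=3, padding=1),
        ])

    def forward(self, x):
        pool_idx = []
        for layer in self.module_list:
            if isinstance(layer, nn.MaxPool2d):
                x, idx = layer(x)
                pool_idx.append(idx)
            elif isinstance(layer, nn.MaxUnpool2d):
                x = layer(x, pool_idx.pop(-1))
            else:
                x = layer(x)
        return x

model = ToyUnpoolAE()
out = model(torch.randn(2, 1, 28, 28))
print(out.shape)  # torch.Size([2, 1, 28, 28]), the input resolution is restored

Because the indices are appended on the way down and popped on the way up, this works no matter how many pooling/unpooling pairs the dynamic network ends up with, as long as they are symmetric.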