Can I use the same pooling layer?

Pooling layers don't have parameters to learn.
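
For example (just a quick check; pool is a throwaway instance), an nn.AvgPool1d module reports an empty parameter list:

import torch.nn as nn

pool = nn.AvgPool1d(kernel_size=2, stride=2)
print(list(pool.parameters()))   # prints [] -- nothing to train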

So if I build the following network:

import torch
import torch.nn          as nn
import torchinfo


class PoolModel(nn.Module):
    
    def __init__(self):
        super().__init__()

        self.GRU1      = nn.GRU(input_size = 2, hidden_size = 32, num_layers = 2, batch_first=True)
        self.Pooling1  = nn.AvgPool1d(kernel_size = 2, stride = 2)
        self.GRU2      = nn.GRU(input_size = 2, hidden_size = 16, num_layers = 2, batch_first=True)
        
        # do I need the second pooling layer or can I use Pooling1 ?
        # self.Pooling2   = nn.AvgPool1d(kernel_size = 2, stride = 2) 
        

    def forward(self, X):
        
        output, h = self.GRU1(X)
        output    = self.Pooling1(output)
        output, h = self.GRU2(X)
        
        output    = self.Pooling1(output)
        # or:
        #output    = self.Pooling2(output)
        
        return output

torchinfo.summary(PoolModel(), (512, 20, 2))  

==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
PoolModel                                [512, 20, 8]              --
├─GRU: 1-1                               [512, 20, 32]             9,792
├─AvgPool1d: 1-2                         [512, 20, 16]             --
├─GRU: 1-3                               [512, 20, 16]             2,592
├─AvgPool1d: 1-4                         [512, 20, 8]              --
==========================================================================================
Total params: 12,384
Trainable params: 12,384
Non-trainable params: 0
Total mult-adds (M): 126.81
==========================================================================================
Input size (MB): 0.08
Forward/backward pass size (MB): 3.93
Params size (MB): 0.05
Estimated Total Size (MB): 4.06
==========================================================================================

Can I use the same pooling layer (Pooling1) twice, or do I need to create a second nn.AvgPool1d instance (Pooling2)?

You can reuse the same pooling layer and don't need to create a new module for each call: nn.AvgPool1d has no learnable parameters or internal state, so applying Pooling1 twice is equivalent to using two separate instances.
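
Here is a minimal sketch (the tensor shape and variable names below are made up for illustration) showing that calling one AvgPool1d instance twice produces exactly the same result as using two separate instances:

import torch
import torch.nn as nn

x      = torch.randn(512, 20, 32)               # e.g. a GRU output with 32 features
shared = nn.AvgPool1d(kernel_size=2, stride=2)  # one module, called twice
pool_a = nn.AvgPool1d(kernel_size=2, stride=2)  # two separate modules
pool_b = nn.AvgPool1d(kernel_size=2, stride=2)

out_shared   = shared(shared(x))                # 32 -> 16 -> 8 features
out_separate = pool_b(pool_a(x))

print(torch.equal(out_shared, out_separate))    # True -- identical results

Since average pooling is a fixed, parameter-free operation, the only reason to create separate modules is if the two pooling steps need different settings (e.g. different kernel sizes). Modules with learnable weights, by contrast, should only be reused when weight sharing is actually intended.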