Can we use an activation layer before global average pooling?

Hello, I am trying to reach 99.4% accuracy on the MNIST dataset. I tried several different architectures, and on one of them I got 99.4% accuracy. That architecture uses nn.AdaptiveAvgPool2d(1); before this layer I applied a ReLU and fed its output into the nn.AdaptiveAvgPool2d(1) layer. Can we use ReLU before global average pooling?
The relevant piece of code is:

        x = self.conv7(x)
        x = F.relu(x)        # activation applied before global average pooling
        x = self.avgpool(x)  # (N, 10, H, W) -> (N, 10, 1, 1)
        x = x.view(-1, 10)
        return F.log_softmax(x, dim=1)  # explicit dim avoids the deprecation warning

Here self.avgpool = nn.AdaptiveAvgPool2d(1).
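
For context, nn.AdaptiveAvgPool2d(1) averages each channel's spatial map down to a single value, so a (N, 10, H, W) activation becomes (N, 10, 1, 1). A quick shape check (with an arbitrary example input):

    import torch
    import torch.nn as nn

    pool = nn.AdaptiveAvgPool2d(1)
    x = torch.randn(4, 10, 5, 5)  # (batch, channels, height, width)
    print(pool(x).shape)          # torch.Size([4, 10, 1, 1])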

You can use ReLU before adaptive average pooling. Check well-known architectures such as ResNet and SqueezeNet: both use adaptive average pooling at the end, and in both the feature map that reaches the pooling layer has just passed through a ReLU.
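
To make the pattern concrete, here is a minimal sketch of a network tail in this style. The layer names and channel sizes (16 input channels into conv7) are hypothetical, chosen only to match the snippet above, not taken from your model:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyMnistHead(nn.Module):
        """Hypothetical network tail: conv -> ReLU -> global average pool -> log-softmax."""
        def __init__(self, in_channels=16, num_classes=10):
            super().__init__()
            # The last conv emits one channel per class, as in the snippet above.
            self.conv7 = nn.Conv2d(in_channels, num_classes, kernel_size=3, padding=1)
            self.avgpool = nn.AdaptiveAvgPool2d(1)

        def forward(self, x):
            x = self.conv7(x)
            x = F.relu(x)        # activation before global average pooling
            x = self.avgpool(x)  # (N, 10, H, W) -> (N, 10, 1, 1)
            x = x.view(-1, 10)
            return F.log_softmax(x, dim=1)

    out = TinyMnistHead()(torch.randn(2, 16, 7, 7))
    print(out.shape)  # torch.Size([2, 10])

You can also confirm the pattern in torchvision: resnet18().avgpool is an AdaptiveAvgPool2d(output_size=(1, 1)), and SqueezeNet's classifier places a ReLU immediately before its adaptive average pool.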


Thank you, prasad, for answering.