Extract ReLU outputs of EfficientNet and Inception

I want to reproduce this paper, and I was wondering how to extract the ReLU outputs of the EfficientNet and Inception models.

From reading about the models, ReLU isn't an explicit layer but is applied inside the convolution blocks. I want to confirm whether just using the output of those convolution blocks would be appropriate for this purpose.

Moreover, EfficientNet uses SiLU instead of ReLU. So what should I do in that case?

Let’s break this down:

  • Extracting ReLU outputs: In models like EfficientNet and Inception, ReLU (Rectified Linear Unit) is typically applied inside the convolutional blocks rather than as a standalone layer. To get the post-ReLU activations, you can take the output of those convolutional blocks directly, since the ReLU activation is the last operation they apply (see the sketch after this list).
  • EfficientNet and SiLU: You’re correct! EfficientNet uses the SiLU (Sigmoid Linear Unit) activation function instead of ReLU. If you need the outputs after the activation function, you should take the output of the SiLU (i.e. the block that ends in SiLU) instead.
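
If it helps, here is a minimal sketch of how you could capture those activations with forward hooks, assuming the torchvision implementations of both models. The blocks `Conv2d_2b_3x3` and `features[0]` are only examples; pick whichever blocks the paper actually refers to.

```python
import torch
from torchvision import models

activations = {}

def save_activation(name):
    # Forward hook that stores the block's output under `name`.
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

# Inception v3: each BasicConv2d block runs Conv2d -> BatchNorm2d -> ReLU
# inside its forward(), so hooking the block output gives post-ReLU features.
inception = models.inception_v3(weights="IMAGENET1K_V1")
inception.eval()
inception.Conv2d_2b_3x3.register_forward_hook(save_activation("inception_post_relu"))

with torch.no_grad():
    inception(torch.randn(1, 3, 299, 299))

# EfficientNet-B0: the Conv2dNormActivation blocks end in SiLU,
# so the same hook yields post-SiLU features instead of post-ReLU ones.
effnet = models.efficientnet_b0(weights="IMAGENET1K_V1")
effnet.eval()
effnet.features[0].register_forward_hook(save_activation("effnet_post_silu"))

with torch.no_grad():
    effnet(torch.randn(1, 3, 224, 224))

print(activations["inception_post_relu"].shape)
print(activations["effnet_post_silu"].shape)
```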

Best regards,
DianaP.

In the paper, they mention that they extracted the ReLU outputs of the EfficientNet model. Why would they say that? Is there a specific EfficientNet variant that uses ReLU?

And regarding Inception, if I extract the CNN outputs, would they automatically be the post-ReLU activations?
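
Here is a quick check I was planning to run to confirm that, assuming the torchvision `inception_v3` implementation (`Conv2d_2b_3x3` is just an example block): if the block output really is post-ReLU, every value in it should be non-negative.

```python
import torch
from torchvision import models

model = models.inception_v3(weights="IMAGENET1K_V1")
model.eval()

captured = {}
# Capture the output of one conv block so we can inspect its values.
model.Conv2d_2b_3x3.register_forward_hook(
    lambda module, inputs, output: captured.update(block_out=output.detach())
)

with torch.no_grad():
    model(torch.randn(1, 3, 299, 299))

# If the block output is post-ReLU, there should be no negative values.
print((captured["block_out"] >= 0).all())
```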