How to print the values in the loop as they pass through the ReLU3 layer

Hi!

I need help printing the values in the training loop as they pass through the ReLU3 layer. The idea is to save the view of each layer.

My code:

import torch
import torch.nn as nn
import torch.nn.functional as F

import brevitas.nn as qnn
from brevitas.quant import IntBias
from brevitas.inject.enum import ScalingImplType
from brevitas.inject.defaults import Int8ActPerTensorFloatMinMaxInit

class convModel(nn.Module):

    def __init__(self):
        super().__init__()
        # QuantConv2d(in_channels, out_channels, kernel_size, ...)
        self.quantconv1 = qnn.QuantConv2d(3, 32, 3, weight_bit_width=4)
        self.batchnorm1 = nn.BatchNorm2d(32)
        self.quantReLU1 = qnn.QuantReLU(bit_width=4)

        self.quantconv2 = qnn.QuantConv2d(32, 64, 3, weight_bit_width=4)
        self.batchnorm2 = nn.BatchNorm2d(64)
        self.quantReLU2 = qnn.QuantReLU(bit_width=4)

        self.quantconv3 = qnn.QuantConv2d(64, 128, 3, weight_bit_width=4)
        self.batchnorm3 = nn.BatchNorm2d(128)
        self.quantReLU3 = qnn.QuantReLU(bit_width=4)

        self.quantconv4 = qnn.QuantConv2d(128, 128, 3, weight_bit_width=4)
        self.batchnorm4 = nn.BatchNorm2d(128)
        self.quantReLU4 = qnn.QuantReLU(bit_width=4)

        self.quantconv5 = qnn.QuantConv2d(128, 128, 3, weight_bit_width=4)
        self.batchnorm5 = nn.BatchNorm2d(128)
        self.quantReLU5 = qnn.QuantReLU(bit_width=4)

        # QuantMaxPool2d(kernel_size, stride, ...)
        self.quantmaxpool = qnn.QuantMaxPool2d(2, 2)
        # standard torch.nn dropout applied to the flattened FC features
        # (brevitas.nn does not provide a Dropout2d module)
        self.dropout1 = nn.Dropout(0.2)
        self.dropout2 = nn.Dropout(0.4)

        self.quantfc1 = qnn.QuantLinear(4 * 4 * 128, 512, bias=True)
        self.quantfc2 = qnn.QuantLinear(512, 256, bias=True)
        self.quantfc3 = qnn.QuantLinear(256, 84, bias=True)
        self.quantfc4 = qnn.QuantLinear(84, 10, bias=True)

    def forward(self, x):
        # NOTE: the QuantReLU modules defined above (including self.quantReLU3)
        # are never called in this forward pass
        x = F.relu(self.batchnorm1(self.quantconv1(x)))              # 32 filters
        x = F.relu(self.batchnorm2(self.quantconv2(x)))              # 64 filters
        x = self.quantmaxpool(self.batchnorm3(self.quantconv3(x)))   # 128 filters
        x = F.relu(x)
        x = F.relu(self.batchnorm4(self.quantconv4(x)))              # 128 filters
        x = self.quantmaxpool(self.batchnorm5(self.quantconv5(x)))   # 128 filters
        x = F.relu(x)

        x = torch.flatten(x, 1)
        x = F.relu(self.quantfc1(x))
        x = self.dropout2(x)                                         # 40%
        x = F.relu(self.quantfc2(x))
        x = self.dropout1(x)                                         # 20%
        x = F.relu(self.quantfc3(x))
        x = self.dropout1(x)
        x = self.quantfc4(x)                                         # 10-class output

        return x
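
For reference, here is a minimal sanity check of the model above with a dummy CIFAR-10-sized batch; the batch size of 8 is an arbitrary choice for illustration, and the expected output shape follows from the layer definitions rather than anything stated in the post:

# minimal sketch: run the model once on random CIFAR-10-shaped input
model = convModel()
model.eval()
dummy = torch.randn(8, 3, 32, 32)   # 8 images of shape 3x32x32
with torch.no_grad():
    out = model(dummy)
print(out.shape)                     # should be torch.Size([8, 10])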

I’m not sure I understand the question completely. It seems you would like to print the outputs of self.quantReLU3, but this layer doesn’t seem to be used at all in the forward method.
Could you explain the use case a bit more and which values should be printed?

Good Morning @ptrblck

Let me explain my project:

My project is about Split Learning.

I built a CNN on the CIFAR-10 dataset and then quantized it with Brevitas. The next step is to create a variable that, for each batch, stores the output of the ReLU3 layer. I need to do this in the middle of the training loop and print the values as they pass through the chosen layer.
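
In case it helps, here is a minimal sketch of how the quantReLU3 output could be captured and printed for every batch with a forward hook. It assumes the forward method is updated so that self.quantReLU3 is actually called (as noted above, it currently isn't, so a hook on it would never fire); train_loader is a placeholder for your own CIFAR-10 DataLoader:

# minimal sketch, assuming forward() is changed to call self.quantReLU3
relu3_outputs = []   # one tensor per batch

def save_relu3(module, inp, out):
    # 'out' is the activation leaving quantReLU3 for the current batch
    # (if the layer is configured with return_quant_tensor=True, use out.value instead)
    relu3_outputs.append(out.detach().cpu())
    print("quantReLU3 output:", out.shape)

model = convModel()
handle = model.quantReLU3.register_forward_hook(save_relu3)

# the hook then runs automatically on every forward pass inside the training loop:
# for images, labels in train_loader:
#     outputs = model(images)
#     ...

handle.remove()   # remove the hook once the activations are no longer needed

A forward hook keeps the forward method's return value unchanged, so the model can still be trained normally while the activation at the split point is collected and printed on the side.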